Reading very large (193 GB) binary file


@jgold47 wrote:

Wiring a cluster rather than the array of clusters, yes, I am able to open and read from the large file.  This will require other changes "downstream," but it might solve my problem.  (In response to another reply that popped up, yes, I had previously tried wiring values other than 1 to "count.")


Excellent! Then read 10^6 clusters at a time and analyze the data in chunks. (Working with single elements would probably be slow.)
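Since LabVIEW is graphical, here is a rough Python analogue of the chunked-read idea, a sketch only: the 12-byte record layout (one float64 plus one int32) stands in for the cluster and is an assumption, not the poster's actual format.

```python
import io
import struct

# Hypothetical record layout standing in for the LabVIEW cluster:
# one float64 value plus one int32 channel ID (12 bytes per record).
RECORD = struct.Struct("<di")
CHUNK_RECORDS = 1_000_000  # read 10^6 records per call, as suggested above

def read_chunks(f, chunk_records=CHUNK_RECORDS):
    """Yield lists of unpacked records, one chunk at a time."""
    chunk_bytes = chunk_records * RECORD.size
    while True:
        data = f.read(chunk_bytes)
        if not data:
            break
        # Guard against a truncated trailing record at end-of-file.
        usable = len(data) - (len(data) % RECORD.size)
        yield list(RECORD.iter_unpack(data[:usable]))

# Demo on an in-memory "file" of three records, chunked two at a time.
demo = io.BytesIO(b"".join(RECORD.pack(v, i) for i, v in enumerate([1.5, 2.5, 3.5])))
for chunk in read_chunks(demo, chunk_records=2):
    print(len(chunk))
```

The key point is that memory use is bounded by the chunk size, not the file size, so the same loop works on a 193 GB file.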

/Y

G# - Award winning reference based OOP for LV, for free! ADDQ VIPM Now on GitHub
"Only dead fish swim downstream" - "My life for Kudos!" - "Dumb people repeat old mistakes - smart ones create new ones."
Certified-LabVIEW-Developer
Message 21 of 23

@jgold47 wrote:

Wiring a cluster rather than the array of clusters, yes, I am able to open and read from the large file.  This will require other changes "downstream," but it might solve my problem.  (In response to another reply that popped up, yes, I had previously tried wiring values other than 1 to "count.")


It may also solve a problem that you have yet to find!

 

I also had to handle a multi-gigabyte file, and the large amount of data presented a performance challenge just processing it all. I ended up using a chain of Producer-Consumer loops such that...

 

Loop1

Looks for the CR in the text file and passes the record to Loop2

 

Loop2

Formats the data in the record as a cluster and passes it to Loop3

 

Loop3

Checks the timestamp to see if it is in the desired time range and, if so, passes it to Loop4

 

Loop4

Applies the new set of readings to the GUI

 

This let me put multiple CPU cores to work and update the GUI on a regular schedule.
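The four-loop chain above can be sketched in Python with threads and queues standing in for LabVIEW loops; the stage names, record format (`timestamp,value` lines), and time range are illustrative assumptions, not Ben's original code.

```python
import queue
import threading

SENTINEL = None  # shutdown marker passed down the chain

def stage(in_q, out_q, work):
    """Generic pipeline stage: apply `work` to each item, forward results."""
    while True:
        item = in_q.get()
        if item is SENTINEL:
            if out_q is not None:
                out_q.put(SENTINEL)
            break
        result = work(item)
        if result is not None and out_q is not None:
            out_q.put(result)

# Loop2: format a record line as a (timestamp, value) "cluster".
def parse(line):
    ts, val = line.split(",")
    return (float(ts), float(val))

# Loop3: keep only records inside the desired time range (assumed [1.0, 2.0]).
def in_range(rec):
    return rec if 1.0 <= rec[0] <= 2.0 else None

results = []  # Loop4 stand-in: where the "GUI" updates land
q12, q23, q34 = queue.Queue(), queue.Queue(), queue.Queue()
threads = [
    threading.Thread(target=stage, args=(q12, q23, parse)),
    threading.Thread(target=stage, args=(q23, q34, in_range)),
    threading.Thread(target=stage, args=(q34, None, results.append)),
]
for t in threads:
    t.start()

# Loop1: look for CR in the text and pass each record to Loop2.
raw = "0.5,10\r1.5,20\r2.5,30\r"
for line in raw.split("\r"):
    if line:
        q12.put(line)
q12.put(SENTINEL)
for t in threads:
    t.join()
print(results)  # only records whose timestamp falls in the desired range
```

Each stage runs on its own thread, so parsing, filtering, and display can proceed concurrently, which is the same reason the LabVIEW version could exploit multiple cores.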

 

So embrace the change!

 

It may be good.

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 22 of 23
Solution
Accepted by topic author jgold47

Time to close this out, even though I still don't have an answer to the real question: Why does the file size matter?

 

I am able to open the very large binary file; the error occurs the first time I try to read from it.  It is not the format of the file: if I use a utility to carve off a relatively small section from the front of the file, I can read the first "chunks" of data I saved.  So it is the file size itself.  For some reason, the binary read VI looks at the size of the file and says "no way," even though I'm only trying to read a manageable chunk at a time.
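For comparison, a chunked read need never depend on the total file size: seek to a record-aligned offset, read only the chunk, and stop. This is a Python sketch, not the LabVIEW VI, and the 12-byte record size is an illustrative assumption; failures that appear only past a size threshold are sometimes a sign of offsets or counts being computed in 32-bit arithmetic somewhere, though nothing in this thread confirms that is the cause here.

```python
import os
import tempfile

RECORD_SIZE = 12  # hypothetical fixed record size in bytes

def read_chunk(path, start_record, n_records):
    """Read n_records starting at start_record, touching nothing else."""
    with open(path, "rb") as f:
        # Python ints are arbitrary precision, so this offset works
        # equally well at byte 24 or byte 200_000_000_000.
        f.seek(start_record * RECORD_SIZE)
        return f.read(n_records * RECORD_SIZE)

# Demo with a small temporary file of 4 records (bytes 0..47).
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(bytes(range(48)))
data = read_chunk(tmp.name, 2, 1)   # third record: bytes 24..35
os.remove(tmp.name)
```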

 

But I don't see this discussion as getting me much further, so as I say, time to close this out.

Message 23 of 23