LabVIEW


Reading very large (193 GB) binary file

Solved!

How much free space do you have on your hard-drive?

 

With virtual memory (which is dog slow), you can work with memory spaces that are larger than physical memory, but you will be limited to the maximum amount of swap space that can be allocated on the hard drive.

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 11 of 24

So what happens if you set the Read Length to "1"?

 

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 12 of 24

This is the first time I have used this forum for help, and I apologize for not following "protocol." In particular, I apologize for not posting the complete set of VIs involved; getting permission to do so would not be trivial (national laboratories can be that way), and I don't know that it would really help.

To reiterate the key points (and clear up some confusion I may have created about just where the problem is occurring): I have no difficulty with "smaller" binary files (at least 56 GB in size), so I am convinced I am not doing anything intrinsically wrong with binary file I/O. I am able to successfully open the "large" file (at least 193 GB) using Open/Create/Replace File, and read its size (296379234836, using Get File Size). The failure (error 116) occurs when I first try to read it using Read from Binary File. If I use a (non-LabVIEW) utility to carve off a moderate-size chunk (512 MB) from the beginning of the file, I AM able to read in and process the ~40-MB binary segments in that chunk, so there is nothing wrong with the first part of the file (or anything recorded in it that causes a problem). So the Read from Binary File VI is for some reason unhappy with the size of the file, and it immediately returns the error code. Other than carving the file up into pieces that each include their own entry count (something I believe one of my colleagues can do), I'm hoping for an easier workaround.

A thought related to Kyle97330's response about the number of array elements: I am reading in a binary array. For each "segment" (i.e., each time I use Read from Binary File), the maximum number of elements is less than 2 million. However, IF Read from Binary File is not getting the actual element count from the segment header, but is instead calculating the number of array elements in the entire file, it would (as Kyle97330 pointed out) exceed the number that can be represented by a 32-bit integer. Might that be the problem? (I have been assuming that, for each read, it only cares about the count in the segment header.)
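If it helps to see the arithmetic behind that suspicion, here is a quick check (in Python, since LabVIEW diagrams don't paste into a post). The only input is the file size reported above by Get File Size; LabVIEW array sizes are signed 32-bit (I32), so a count derived from the whole file cannot fit.

```python
# Quick overflow check: LabVIEW array sizes are signed 32-bit (I32),
# so an element count derived from the WHOLE file cannot fit.
FILE_SIZE_BYTES = 296_379_234_836   # value returned by Get File Size
I32_MAX = 2**31 - 1                 # 2,147,483,647

# Even if each element were a single byte, the count overflows I32.
print(FILE_SIZE_BYTES > I32_MAX)    # True
print(FILE_SIZE_BYTES // I32_MAX)   # 138 -> roughly 138x over the limit
```

So even the most conservative whole-file element count is about two orders of magnitude past what an I32 can represent.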

Message 13 of 24

 

 

As a quick test, try reading the file as a byte array. As you have it set up, LabVIEW will try to convert the data into an array of the type you specified, and if the entire array will not fit into memory, it will be truncated.

 

If the byte-array read works, you COULD (I admit it will not be trivial) read chunks of the file and cast the chunks to the cluster bit by bit.
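Since LabVIEW is graphical, the idea is easier to sketch in text. Below is a minimal Python equivalent of that suggestion, assuming a hypothetical segment layout (an I32 element count followed by that many DBL values; the actual cluster layout in the original VIs is not known here). The point is that each read asks only for one segment's worth of bytes, never the whole file.

```python
import struct

def read_segments(path, max_segments=None):
    """Read length-prefixed segments one at a time instead of
    asking for the whole file as a single array.

    Assumed (hypothetical) layout per segment:
    big-endian I32 count, then `count` big-endian DBLs.
    """
    segments = []
    with open(path, "rb") as f:
        while max_segments is None or len(segments) < max_segments:
            header = f.read(4)                   # read 4 bytes, not the file
            if len(header) < 4:
                break                            # clean end of file
            (count,) = struct.unpack(">i", header)
            payload = f.read(8 * count)          # 8 bytes per DBL
            if len(payload) < 8 * count:
                raise IOError("truncated segment")
            segments.append(struct.unpack(f">{count}d", payload))
    return segments
```

Memory use is bounded by the largest single segment (~40 MB in this thread), regardless of the total file size.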

 

Ben

Message 14 of 24

My thought was that if you set the read length to 1, you'll get one of your arrays (or possibly one element). If that works, you can read in chunks of 10 or 1000 and work from there.

/Y

Message 15 of 24

The VI is not executable if I wire 1 (or any other value) to "count."  From the help information, it sounds like the value should specify the number of instances of "data type" (in my case an array) to read, but LabVIEW doesn't like it.

Message 16 of 24

@jgold47 wrote:

The VI is not executable if I wire 1 (or any other value) to "count."  From the help information, it sounds like the value should specify the number of instances of "data type" (in my case an array) to read, but LabVIEW doesn't like it.


Have you tried wiring a cluster rather than an array of clusters?

 

Ben

Message 17 of 24

@jgold47 wrote:

The VI is not executable if I wire 1 (or any other value) to "count."  From the help information, it sounds like the value should specify the number of instances of "data type" (in my case an array) to read, but LabVIEW doesn't like it.


Try 2.

Message 18 of 24

Wiring a cluster rather than the array of clusters, yes, I am able to open and read from the large file.  This will require other changes "downstream," but it might solve my problem.  (In response to another reply that popped up, yes, I had previously tried wiring values other than 1 to "count.")

Message 19 of 24

@jgold47 wrote:

Wiring a cluster rather than the array of clusters, yes, I am able to open and read from the large file.  This will require other changes "downstream," but it might solve my problem.  (In response to another reply that popped up, yes, I had previously tried wiring values other than 1 to "count.")


Thank you for that update.

 

The post-mortem on this question reduces to...

 

The error code 116 was telling us that LabVIEW could not read all of the clusters into memory at one time. By switching over to reading a single cluster (or a small group of clusters) at a time, we can access the contents of the file... as far as we know.
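For anyone landing here later, the workaround reduces to bounded reads. A minimal sketch in Python of the same pattern (the record layout here is hypothetical; substitute whatever your cluster actually contains):

```python
import struct

# Hypothetical fixed-size record standing in for the cluster:
# one big-endian I32 plus one big-endian DBL (12 bytes total).
RECORD = struct.Struct(">id")
CHUNK_RECORDS = 1000          # records per read; tune to taste

def process_file(path, handle_record):
    """Stream the file a bounded chunk of records at a time,
    so memory use stays constant no matter how large the file is.
    Assumes the file consists of whole records."""
    with open(path, "rb") as f:
        while True:
            data = f.read(RECORD.size * CHUNK_RECORDS)
            if not data:
                break
            for rec in RECORD.iter_unpack(data):
                handle_record(rec)
```

The same structure applies on a LabVIEW diagram: a while loop around Read from Binary File with a small fixed count, stopping on end-of-file.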

 

Ben

Message 20 of 24