
Reading very large (193 GB) binary file

Solved!

Using LabVIEW 2013 on a 64-bit Windows 7 system: I have a data acquisition system that saves chunks of data to a binary file every 10 seconds.  For intermediate-duration data runs (producing a binary data file up to 56 GB in size), I have no difficulty subsequently opening the file and reading the binary data in order to process it.  However, when I try to use the same code to access a larger file (193 GB), the "Read from Binary File" VI immediately produces error code 116.  Is there a file-size limit between 56 GB and 193 GB that I'm running into?  Is there anything I can do to successfully process this file?  (I can use a utility to separate it into smaller files, and then I can process the first file, but the subsequent files don't "start in the right place" for my binary data structure.)

Message 1 of 24

Irritating arbitrary limits on computers usually happen at powers of two, so if 56 is OK but 193 is not, there's a good chance that the problem starts at either 64 or 128 GB.

 

Is there a chance that LabVIEW is trying to create an array with a length over 2^31-1 (2.1 billion and change), i.e. the maximum value of an I32?  Since I32s are what LabVIEW uses for array indexes, this could be an issue.
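(A quick sanity check on that limit, sketched in Python since a LabVIEW block diagram can't be pasted as text:)

```python
# LabVIEW uses signed 32-bit integers (I32) for array indexes,
# so the longest possible array is:
MAX_I32 = 2**31 - 1
print(MAX_I32)   # 2147483647, i.e. about 2.1 billion elements
```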

 

What's the size of each element that you're trying to save to the binary file?

 

Can you post the VIs that cause the problem?  Something else could be going on that causes the issue.

Message 2 of 24

I don't see why LabVIEW would be trying to create an enormous array.  Even for the "smaller" file (56 GB), there wouldn't be room for everything.  On the other hand, the error condition does happen immediately when I try to open the larger file (not after it's been reading elements in for a while), so it is evidently making some decision based on the size of the file.

For my higher-data-rate situation, each 10-second interval writes about 41 MB to the binary file (consisting of about 1 million complex data elements).
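(Back-of-the-envelope, those numbers imply roughly 40 bytes per element; a quick Python check, with the sizes above taken as approximations:)

```python
# ~41 MB written per 10-second interval, ~1 million elements per write
bytes_per_write = 41_000_000
elements_per_write = 1_000_000
print(bytes_per_write / elements_per_write)   # ~41 bytes per element
```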

I said in my original posting that I'm running a 64-bit system.  I am, but "About LabVIEW" shows that I'm actually running a 32-bit version of LabVIEW 2013 (Service Pack 1).

Message 3 of 24

1. Post your vi!

2. If possible, do as much preprocessing as possible on-the-fly. It will save a lot of trouble.

3. Exactly how do you "open the file"? If you are reading the entire file contents into some structure you will run into problems. There are better ways (see the sketch after this list).

4. If you adapt your acquisition system to do it from the start, you should be able to save smaller files that conform to the right structure.
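To illustrate point 3: read and process the file in fixed-size pieces instead of all at once. Here is a minimal sketch in Python (the file name, record layout, and chunk size are assumptions for illustration; the same pattern applies to calling Read from Binary File inside a loop):

```python
import struct

RECORD_SIZE = 40               # assumed: a cluster of ten 4-byte values
RECORDS_PER_CHUNK = 1_000_000  # ~40 MB per read, never the whole file

with open("huge_acquisition.bin", "rb") as f:   # hypothetical file name
    while True:
        chunk = f.read(RECORD_SIZE * RECORDS_PER_CHUNK)
        if not chunk:
            break
        # Unpack only complete records; a partial tail would mean the
        # file was truncated mid-record.
        for i in range(len(chunk) // RECORD_SIZE):
            record = struct.unpack_from(">10i", chunk, i * RECORD_SIZE)
            # ... process one record here ...
```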

Message 4 of 24

I'd need to see your code to tell whether that is indeed the case, as I don't know what you have wired into the "Read from Binary File" function.

 

Here's what might cause your problem:

 

[Attached image: Open binary file.png]

Message 5 of 24

The attachment shows the relevant snippets of code.  Nothing special.  I open a file and do repeated binary writes of a large array (structure shown at bottom left; for each write, "prepend array or string size?" is left at its default of TRUE, so the size of that array is included), and except for the very large file, I have no problem reading that structure back in.  The snippet at bottom right shows how I open the file (generating error 116 for the very large file).
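For anyone following along, that setting implies roughly this on-disk layout: each write is an I32 element count followed by the flattened array elements. A Python sketch of reading back one such block (the 40-byte record size is an assumption; LabVIEW's default byte order for binary files is big-endian):

```python
import struct

RECORD_SIZE = 40   # assumed size of one flattened cluster, in bytes

def read_block(f):
    """Read back one size-prefixed array, as written by Write to Binary File
    with "prepend array or string size?" left TRUE (the default)."""
    hdr = f.read(4)
    if len(hdr) < 4:
        return None                      # end of file
    (count,) = struct.unpack(">i", hdr)  # big-endian I32 element count
    return f.read(count * RECORD_SIZE)   # the array elements themselves
```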

 

Message 6 of 24

Without actual code it's very hard to know for sure, but I see an array of a cluster of 10 elements wired into the top.  Yes?

 

I'm going to go ahead and guess that they're all 32-bit integers, but again, it's hard to see well enough to know for sure.  If so, 10 elements at 4 bytes each makes each array element 40 bytes.

 

That means that as soon as you try to read a file larger than ((2^31)-1) * 40 bytes, you'll run out of array indexes.  That's 85,899,345,880 bytes, or about 80 gigabytes.  Nicely in between the values that do and don't work.
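(Checking that arithmetic:)

```python
MAX_I32 = 2**31 - 1           # largest LabVIEW array index
RECORD_SIZE = 40              # bytes per cluster, as guessed above
limit = MAX_I32 * RECORD_SIZE
print(limit)                  # 85899345880 bytes
print(limit / 2**30)          # ~80.0, i.e. about 80 GB
```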

 

I suggest that you revisit the method of splitting the file into smaller files, and just make sure you slice them evenly on a multiple of 40 bytes (or whatever size your cluster is, if some of the numbers in there are not I32s).  A sketch of that is below.
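A minimal splitter along those lines, in Python (the file names and part size are assumptions; the point is that every part ends exactly on a record boundary):

```python
RECORD_SIZE = 40   # or whatever your cluster size actually is
# ~1 GB per part, rounded down to a whole number of records
PART_BYTES = (1_000_000_000 // RECORD_SIZE) * RECORD_SIZE

with open("huge_acquisition.bin", "rb") as src:   # hypothetical file name
    part = 0
    while True:
        data = src.read(PART_BYTES)
        if not data:
            break
        with open(f"part_{part:04d}.bin", "wb") as dst:
            dst.write(data)
        part += 1
```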

 

 

Message 7 of 24

Interesting!  Since I'm only reading in ~40 MB at a time for ~1 million array elements, I didn't think that the file size would matter.  But IF LabVIEW is trying to deal with the size of the ENTIRE array when I open the file (even though I don't read it in that way), I can see that the total number of array elements in the file could cause the problem.  I agree that saving the data in smaller-size file chunks is the right approach, but unfortunately I can't do that for the measurements I performed earlier this week (I had never generated such large files before, so the problem had not cropped up).

Message 8 of 24

You are having a problem and we are willing to help, but you (the one with the problem) are choosing what you think are the "relevant portions of the code" and sending us a picture of the code.

 

If you've spent any time on the Forum, you'll have seen that we can be much more helpful when you post your VI as-is, warts and all, so that we can (a) inspect all of the code, (b) look at all of the cases of your Case structures, (c) edit (and sometimes "clean up") the code, and (d) test the code.  We can also tell which version of LabVIEW you are using, which is sometimes an important consideration.

 

You can certainly write out a binary file that is >100 GB in size, especially if you are not writing it "all at once" (which you can't do anyway, because 32-bit LabVIEW can only hold a few GB in memory).  Similarly, if you know the format of the binary file (which you have to know in order to read it), you can read it back a piece at a time, and as long as you don't need to keep all of the contents in memory, you should have no problems.
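A sketch of "read it back a piece at a time" (Python for illustration; the record size and file name are assumptions, and the same idea maps to Set File Position followed by Read from Binary File):

```python
import struct

RECORD_SIZE = 40   # assumed size of one flattened cluster, in bytes

def read_record(f, index):
    """Seek to and read the index-th fixed-size record; the file as a
    whole is never loaded into memory."""
    f.seek(index * RECORD_SIZE)
    data = f.read(RECORD_SIZE)
    if len(data) < RECORD_SIZE:
        raise EOFError("record index past end of file")
    return struct.unpack(">10i", data)   # ten big-endian I32s

with open("huge_acquisition.bin", "rb") as f:    # hypothetical file name
    rec = read_record(f, 2_000_000_000)  # an offset well past the 80 GB mark
```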

 

Is this "project" maintained inside a LabVIEW Project (i.e. do you have a .lvproj file, with all of the VIs and controls grouped in a single folder on your hard drive)?  If so, I recommend that you compress the folder containing your Project and, instead of attaching the VI that's going astray, attach the resulting .zip file holding the entire Project (be sure to identify (a) the VI whose "picture fragments" you showed and (b) the Top Level VI).  One of us might undertake to simulate your situation and show you how you can Have Your 100 GB Cake and Analyze It, Too.

 

Bob Schor

Message 9 of 24

@jgold47: I am not sure it will make any difference in this case, but I think you should make a habit of using "Preallocated Read from Binary File" when reading large files over several iterations.
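In text form, the preallocated-read pattern looks like this (a Python analogy; the buffer size and file name are assumptions): allocate one buffer up front and reuse it, instead of allocating a fresh array on every read.

```python
RECORD_SIZE = 40
RECORDS_PER_CHUNK = 1_000_000
buf = bytearray(RECORD_SIZE * RECORDS_PER_CHUNK)   # allocated once, reused

with open("huge_acquisition.bin", "rb") as f:      # hypothetical file name
    while True:
        n = f.readinto(buf)    # fills the existing buffer in place
        if n == 0:
            break
        chunk = memoryview(buf)[:n]   # view of the valid bytes, no copy
        # ... process `chunk` here ...
```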

 

Message 10 of 24