


1GB memory limit?

We are currently using a home-written DLL that we call from within LabVIEW to acquire and analyze images from a frame grabber. Until now the output of this DLL was a 2-D array with 1 million I32 elements, and the DLL allocated about 600 MB of memory. Recently we modified the DLL and now get 2-D arrays with up to 16 million elements.

This works just fine on a first run: when the DLL returns to LabVIEW, the total amount of memory allocated by the VI is on the order of 700 MB. However, if we then try to write this data to disk in binary format, the allocated memory sometimes shoots up to over 1 GB (apparently LabVIEW creates many copies of the data in this process). If we then try to call the DLL again, we get an error message that there is not enough memory. The strange thing is that this only happens if we go over the 1 GB limit; 980 MB, for example, is still fine.

We have tried changing the amount of memory in the PC, but this has no influence. We currently have 2 GB installed, yet the 1 GB limit remains. Does anybody have an idea what causes this 1 GB bottleneck and how to eliminate it?
Message 1 of 7
Hi there,

Yes, maybe the code produces copies of the data. Try the following:

- Tools->Advanced->Show Buffer Allocations from the menu bar (not available in the Base package!)
- Step through your code with the debugger while watching the Task Manager, and find the place where the memory explodes.
- Advanced->Data Manipulation->Request Deallocation from the functions palette
- Or post your code here.

Best regards
chris

CL(A)Dly bending G-Force with LabVIEW

famous last words: "oh my god, it is full of stars!"
Message 2 of 7
Yes, checking for extra buffer allocations is the first thing to try. What tools do you use to write the binary data to disk? Maybe there is a better way.
Message 3 of 7

Thanks for the suggestions. I have looked at the memory usage, and it increases when saving to disk using Write To I16 File.vi (after the conversion from I32 to I16). To remedy this I have tried deallocating memory; this frees up a few MB, but no more than that. Although this blow-up is annoying and reducing it would be helpful (especially since we plan to use even bigger arrays), the main problem is that I can no longer allocate memory in my DLL once LabVIEW has passed the 1 GB limit (even if it deallocates that memory again after some time). Any suggestions here?

Message 4 of 7

You are running into a fundamental problem. LabVIEW allocates single arrays as contiguous memory. If your memory is fragmented, and most memory is, you will be unable to use a single array with more data than the largest open memory block in your RAM. With LabVIEW 7.1, this is a shade over 1GByte. Throw in the fact that Windows uses most of the high 512MBytes of the virtual 2GByte limit for system DLLs, and your practical limit, even if you could use everything, is closer to 1.5GBytes.

That said, however, your problem is still solvable. First, read the tutorial Managing Large Data Sets in LabVIEW. The information you need to solve your problem should be there, with code examples. I have successfully handled gigabytes of data using LabVIEW. It is not trivial, but it is possible.

For your particular application, you should probably try things in the following order.

  1. Use a LV2 global or queue global(s) to avoid data copies.
  2. Allocate your image data in stripes to avoid the contiguous memory problem. Take the time to write a nice interface to it so you don't have to keep thinking about how to get a piece into or out of it.
  3. If this does not give you enough memory, use a disk buffer. I would recommend HDF5 as the underlying technology. This is what NI uses in their waveform editors. See Can I Edit and Create Hierarchical Data Format (HDF5) files in LabVIEW?. HDF5 has a really steep learning curve, but it's worth the effort.
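Since the DLL side of this is plain C, point 2 can be sketched there as well. This is a minimal illustration (function names are my own), where each row gets its own allocation, so no single free block needs to be larger than one row:

```c
#include <stdlib.h>

/* Allocate a rows x cols image as independent "stripes" (one malloc per
   row) instead of one contiguous block. The largest contiguous request
   is now one row, not the whole image. */
int **alloc_striped(size_t rows, size_t cols)
{
    int **img = malloc(rows * sizeof *img);
    if (!img) return NULL;
    for (size_t r = 0; r < rows; r++) {
        img[r] = malloc(cols * sizeof **img);
        if (!img[r]) {                 /* roll back already-made rows */
            while (r--) free(img[r]);
            free(img);
            return NULL;
        }
    }
    return img;
}

void free_striped(int **img, size_t rows)
{
    for (size_t r = 0; r < rows; r++) free(img[r]);
    free(img);
}
```

On the LabVIEW side the equivalent is an array of row arrays (or a set of chunked arrays behind a LV2-global interface) rather than one monolithic 2-D array.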

Good Luck! If you have any more problems or need some more tips, let us know.

Message 5 of 7


@MarcelD wrote:
...(after the conversion from I32 to I16)...

So, if you only need I16 data, why do you even have it in memory as I32? Maybe the I32->I16 conversion allocates extra memory, pushing you over the limit? Have you tried writing an I32 binary file instead, avoiding the conversion?

Use only low-level file I/O. You don't need the fancy options of the binary file VIs (open checking, the choice between 1D and 2D arrays, etc.).

Since your memory is OK until you write the data to disk, this alone might solve it.
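If the conversion has to stay, the trick is to convert and write in small chunks so only a chunk-sized scratch buffer is ever needed, never a full second copy of the array. A rough C sketch of the idea (the function name and chunk size are my own choices, not anything from the VIs):

```c
#include <stdint.h>
#include <stdio.h>

enum { CHUNK = 4096 };   /* elements converted per pass; arbitrary */

/* Convert I32 samples to I16 and write them out chunk by chunk.
   Peak extra memory is sizeof(buf), independent of n. */
int write_i32_as_i16(FILE *f, const int32_t *data, size_t n)
{
    int16_t buf[CHUNK];
    while (n > 0) {
        size_t m = n < CHUNK ? n : CHUNK;
        for (size_t i = 0; i < m; i++)
            buf[i] = (int16_t)data[i];   /* narrowing conversion */
        if (fwrite(buf, sizeof(int16_t), m, f) != m)
            return -1;                   /* write error */
        data += m;
        n -= m;
    }
    return 0;
}
```

In LabVIEW the same pattern is a loop that indexes out a subarray, converts it, and appends it to the file, instead of converting the whole 16-million-element array at once.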

Message 6 of 7
Has anyone played with editbin.exe and the /LARGEADDRESSAWARE switch? I heard about this in a Yahoo group. Supposedly, on a 64-bit Windows version, you can patch an executable to access more memory. I think you need some Microsoft development software to get the editbin.exe program.

This option edits the image to indicate that the application can handle addresses larger than 2 gigabytes.

http://msdn2.microsoft.com/en-us/library/203797te(VS.80).aspx

http://gid.cimne.upc.es/support/gid3gb/index.html
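For reference, the switch mentioned above is applied from a Visual Studio command prompt like this (editbin.exe and dumpbin.exe ship with Microsoft's Visual C++ tools; myapp.exe is just a placeholder name):

```shell
rem Mark an existing executable as able to use addresses above 2 GB
editbin /LARGEADDRESSAWARE myapp.exe

rem Verify: the headers should now report
rem "Application can handle large (>2GB) addresses"
dumpbin /HEADERS myapp.exe
```

Note this only raises the per-process address-space ceiling; it does nothing about the contiguous-allocation fragmentation discussed earlier in the thread.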

Message 7 of 7