memory full

Hi Om:
 
I worked with waveforms until I discovered that underneath "Append Waveforms DBL.vi" there was just a "Build Array.vi".
Since then I use waveforms only when a VI needs one as an input; the rest of the time I use plain arrays.
 
In my opinion, waveforms are just arrays with extra information, but the VIs that handle them are not all optimized for memory.
So for small applications they are a very fast and easy solution, but not for a complex one.
 
If you need that big array, preallocate the memory from the beginning, and have every partial result replace a part of this array in place.
 
For the processing, if you always use the same amount of data per iteration (with different contents each time), you can reuse the same memory on every loop iteration with shift registers,
and the result (a waveform) can be copied into part of the initially reserved array (unbundle the waveform and you have its data array).
 
Note that the "t0" and "dt" values are of no use to you: t0 will be 0, or n * [size of array] / [sampling frequency], and dt will be the same for every block.
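LabVIEW is graphical, so text code can only approximate the pattern, but the idea above (preallocate once, overwrite one slice per loop iteration the way "Replace Array Subset" does, and derive t0 from the block index instead of storing it) can be sketched in Python. The buffer size, block size, and sampling frequency here are made-up illustration values:

```python
from array import array

FS = 1000.0   # sampling frequency in Hz (illustrative value)
BLOCK = 4     # samples produced per loop iteration (illustrative)
N_BLOCKS = 3  # number of iterations (illustrative)

# Preallocate the full result buffer once, like wiring an
# "Initialize Array" into a shift register in LabVIEW.
result = array('d', [0.0] * (BLOCK * N_BLOCKS))

for n in range(N_BLOCKS):
    # Pretend this block came from unbundling a waveform's Y array.
    block = [float(n * BLOCK + i) for i in range(BLOCK)]
    # Overwrite a slice of the existing buffer in place, like
    # "Replace Array Subset": no new full-size buffer is allocated.
    result[n * BLOCK:(n + 1) * BLOCK] = array('d', block)
    # t0 is derivable from the block index, so storing it per
    # waveform is redundant; dt is identical for every block.
    t0 = n * BLOCK / FS
    dt = 1.0 / FS
```

The key point is that `result` is allocated exactly once; each iteration only writes into memory that already exists.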
 
Hope it helps,
Aitortxo.
Message 11 of 13
What you are trying to do can probably be done, but you will have to be very careful.  A 200MByte wire can quickly run you out of memory with just a few copies, which are very easy to generate.  I used to routinely work with data sets of this size, so don't get discouraged.  Read the article on large memory management, work through the examples, and you should be good to go.  If you continue having problems, post your code and we can offer more concrete suggestions for your application.

Note that a practical limit on memory use in LabVIEW on WinXP is about 1.5GBytes, provided you do everything right.  Depending on which LabVIEW version you are using, the maximum data you can have in a single array is somewhere between 800MBytes and a bit over 1GByte, due to memory fragmentation issues.  I managed a 1.1GByte array in LV7.1 once, with nothing else running on my machine.
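The copy problem described above is easy to reproduce in any language. A hedged Python sketch (using the standard `array` module, with the size scaled far down from 200 MB for illustration) shows the difference between an operation that materializes a second full-size buffer and one that reuses the existing one, which is the same trade-off LabVIEW's in-place techniques address:

```python
import sys
from array import array

N = 1_000_000  # scaled-down stand-in for a 200 MB data set

data = array('d', [1.0] * N)
buf_size = sys.getsizeof(data)

# Out-of-place: building a new array materializes a second full-size
# buffer, doubling peak memory while both are alive.
scaled_copy = array('d', (x * 2.0 for x in data))

# In-place: overwrite each element of the existing buffer; peak memory
# stays at one copy of the data (plus a scalar temporary).
for i in range(len(data)):
    data[i] *= 2.0

# The in-place pass did not reallocate the buffer.
assert sys.getsizeof(data) == buf_size
```

With a real 200 MB wire, even two or three such accidental copies are enough to exhaust the ~1.5 GB practical limit mentioned above.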
Message 12 of 13
I have a similar problem which requires loading 500 MB of data into memory and accessing it regularly.  Would you mind giving some suggestions?

Link to my post:
http://forums.ni.com/ni/board/message?board.id=170&message.id=286763&jump=true#M286763

Message 13 of 13