LabVIEW


free variable memory

Dear
 
I am writing a program to do data acquisition.
After calculating a set of sweep voltages into an array (using Build Array), data is acquired at each of those voltages.
The array size changes with the number of data points.
For this, I use a While Loop so the work runs continuously.
 
According to the LabVIEW style guide, "If possible, do not build arrays using the Build Array function within a loop because the function makes repetitive calls to the LabVIEW memory manager. A more efficient method of building an array is to use auto-indexing or to pre-size the array and use the Replace Array Element function to place values in it. Similar issues exist when dealing with strings because, in memory, LabVIEW handles strings as arrays of characters."
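
A rough C analogue of the pre-sizing method the guide describes (an illustrative sketch only; LabVIEW itself is graphical, and the function name and the 0.1 V step below are made up):

#include <stdlib.h>

/* The guide's pre-sizing method, sketched in C: allocate the array once,
   then write elements by index instead of growing the array on every
   iteration (roughly analogous to Initialize Array followed by Replace
   Array Element). */
double *build_presized(size_t n)
{
    double *voltages = malloc(n * sizeof *voltages);  /* single allocation */
    if (voltages == NULL)
        return NULL;
    for (size_t i = 0; i < n; i++)
        voltages[i] = 0.1 * (double)i;   /* indexed write, no resize */
    return voltages;
}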
 
Should I insert some routine to free the array at the end of each loop iteration?
Could you please give me some typical examples of this guideline for extreme cases?
 
Thank you in advance. 
Message 1 of 5

labmaster,

I'm not sure you understand the reason for the inefficiency of functions like Build Array. Inserting a function at the end of the loop will not make it any more efficient, because Build Array will still be called on every iteration of the loop and the array will be resized at that point; functions placed after the Build Array have no effect on that cost.

The best thing to do is to pre-allocate the array somehow. In your case you don't know how big the array needs to be, but one possible solution is to preallocate an array that you are sure has enough space to hold the data points. There may be elements that go unused, but LabVIEW won't have to resize the array's memory space on every iteration of the loop.

Here is a tutorial that talks about maximizing the efficiency of a LabVIEW program. About two-thirds of the way down it discusses how to avoid constantly resizing data and walks through an example with the Build Array function. This should help you out some. Good luck and have a great day!
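
A sketch of that suggestion in C terms (the names, the capacity figure, and the stop condition are made up for illustration): reserve a buffer that is sure to be big enough, keep a count of the elements actually used, and resize at most once at the end instead of once per iteration.

#include <stdlib.h>

#define MAX_POINTS 100000               /* assumed upper bound on points */

/* stand-in for whatever produces one measurement; purely illustrative */
static double read_next_sample(void)
{
    static int i = 0;
    return (i < 1000) ? (double)i++ : -1.0;   /* -1.0 means "no more data" */
}

double *acquire_points(size_t *out_count)
{
    double *buf = malloc(MAX_POINTS * sizeof *buf);
    if (buf == NULL)
        return NULL;

    size_t used = 0;
    while (used < MAX_POINTS) {
        double sample = read_next_sample();
        if (sample < 0.0)
            break;                      /* stop condition of the while loop */
        buf[used++] = sample;           /* indexed write, no resize */
    }

    if (used > 0) {                     /* optional: shrink once at the end */
        double *trimmed = realloc(buf, used * sizeof *trimmed);
        if (trimmed != NULL)
            buf = trimmed;
    }
    *out_count = used;
    return buf;
}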

Tyler S.

Message 2 of 5
The way I understand array memory management stems from how LabVIEW automatically manages memory. To "build an array", LabVIEW dynamically allocates enough memory to store the data from the inputs of Build Array, copies all the data, and then frees the old, unused memory automatically as needed (this would be a real pain in C but is too easy in LV). That is very different from auto-indexing or replacing array elements, which do not require dynamic allocation of new memory but write into a preallocated location.

The advantage of dynamically building arrays is flexibility: you do not need to know the size of the array you will be writing to. With Replace Array Element or auto-indexing you are simply writing elements of an already-allocated memory block inside a For Loop using an index, which is fast and ideal in memory efficiency. The downside is that the array size must be fixed before processing the data.

This is not a problem if you are using small arrays, but if you plan to handle huge arrays with millions of data points, then you should use auto-indexing or the element-replace functions (if speed and memory are of concern to you, and I am sure they are). I try to use auto-indexing whenever I can, but sometimes Build Array is necessary.
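
The allocate/copy/free cycle described above, sketched in C purely as an analogy (this is not LabVIEW's actual memory manager): growing an array by one element per iteration means copying everything accumulated so far on every pass, which is O(n^2) copying overall.

#include <stdlib.h>
#include <string.h>

double *grow_by_copy(size_t n)
{
    double *data = NULL;
    size_t len = 0;

    for (size_t i = 0; i < n; i++) {
        double *bigger = malloc((len + 1) * sizeof *bigger);  /* new block */
        if (bigger == NULL) {
            free(data);
            return NULL;
        }
        if (len > 0)
            memcpy(bigger, data, len * sizeof *bigger);       /* copy all */
        bigger[len] = (double)i;                              /* append */
        free(data);                                           /* free old */
        data = bigger;
        len++;
    }
    return data;
}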
 
Paul
     
Paul Falkenstein
Coleman Technologies Inc.
CLA, CPI, AIA-Vision
Labview 4.0- 2013, RT, Vision, FPGA
Message 3 of 5
I tested three cases in the same VI for creating a 1,000,000-element array with the index value i at index [i]:
case 1: using auto-indexing
case 2: using a preallocated (initialized) array and Replace Array Element
case 3: using Build Array.
 
The results for execution speed are:
case 1: 17 ms (fastest, and the best method for fixed-size arrays)
case 2: 24 ms (slightly slower because of initializing the array to all zeros)
case 3: 3471 ms (more than 100x slower, but needed where the data set is small enough and of unknown size)
 
This shows how, for even moderately large data sets, a bad algorithm will keep you from being able to keep up with any real-time processing of the data.
I did not include memory usage, but it probably takes an even worse hit than the processor overhead.
See the attached VI to try it for yourself.
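
A self-contained C timing sketch in the spirit of this test (an analogy only, not the attached VI; the element count is smaller here because the grow-by-copy case scales quadratically):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define N 100000

static double *fill_presized(void)        /* like initialize + replace */
{
    double *a = malloc(N * sizeof *a);
    for (size_t i = 0; a != NULL && i < N; i++)
        a[i] = (double)i;
    return a;
}

static double *fill_by_growing(void)      /* like Build Array in a loop */
{
    double *a = NULL;
    for (size_t len = 0; len < N; len++) {
        double *b = malloc((len + 1) * sizeof *b);
        if (b == NULL) { free(a); return NULL; }
        if (len > 0)
            memcpy(b, a, len * sizeof *b);
        b[len] = (double)len;
        free(a);
        a = b;
    }
    return a;
}

int main(void)
{
    clock_t t0 = clock();
    free(fill_presized());
    clock_t t1 = clock();
    free(fill_by_growing());
    clock_t t2 = clock();

    printf("presized fill : %.1f ms\n", 1000.0 * (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("grow by copy  : %.1f ms\n", 1000.0 * (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}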
 
Paul
 
Paul Falkenstein
Coleman Technologies Inc.
CLA, CPI, AIA-Vision
Labview 4.0- 2013, RT, Vision, FPGA
Message 4 of 5

One more point: if you are doing the array processing in a subVI and only need the array temporarily (while the subVI runs), you can have the array explicitly removed from memory using Advanced->Data Manipulation->Request Deallocation. That's a good idea if you are making many repetitive calls to a processing subVI that manipulates data in an array.
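
A loose C analogy for that point (not the LabVIEW feature itself; the function name and the squaring step are made up): a helper routine that needs a large scratch buffer only while it runs can release that buffer before returning, rather than keeping it allocated between calls.

#include <stdlib.h>

static double process_chunk(const double *in, size_t n)
{
    double *scratch = malloc(n * sizeof *scratch);  /* temporary workspace */
    if (scratch == NULL)
        return 0.0;

    double sum = 0.0;
    for (size_t i = 0; i < n; i++) {
        scratch[i] = in[i] * in[i];                 /* placeholder work */
        sum += scratch[i];
    }

    free(scratch);   /* give the memory back as soon as it is not needed */
    return sum;
}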

Paul

Paul Falkenstein
Coleman Technologies Inc.
CLA, CPI, AIA-Vision
Labview 4.0- 2013, RT, Vision, FPGA
Message 5 of 5