
LabVIEW


LabVIEW doesn't seem to be re-using memory

I have a high-speed, hardware-timed DAQ program (LabVIEW 6.0.2) that gathers synchronous counter/timer information for a short period of time. Due to the sheer volume of data and the speed at which it is gathered, it is buffered and then saved to disk. Upon completion of the acquisition, the data is pulled from disk and processed (X-Y graphs, FFT, etc.). If I try to do another acquisition, the program doesn't seem to re-use the memory used for the data and for displaying it in graphs, so after a few acquisitions the whole thing can grind to a halt. I have to close the application and LabVIEW and restart them so that the acquisition will run properly, and then the whole thing starts over again. How can I get LabVIEW to dump
the data or clear its memory once I have processed the graphs? After all, the raw data is saved on disk anyway.
Message 1 of 8
Geoff,

I am not sure exactly what you have going on here, but here are some things to try:

Make sure your charts have a reasonable history-length setting (100,000 points or so?). Also, make sure your DAQ operations are allocating and deallocating buffers properly; there may be a setting that is easy to overlook. Since I don't have your code in front of me, I wouldn't even begin to know where to look.

As a last resort (and I do mean LAST RESORT) try going into preferences and selecting "Deallocate memory as soon as possible". This is absolutely your last resort, as it has implications for LabVIEW that I am completely unfamiliar with, and may pose other problems.

Good luck
Message 2 of 8
Does anyone know exactly what the "Deallocate memory as soon as possible" option does? We are also trying this as a last resort, but we have no clue what it is actually doing. For example, will it dump data in uninitialized shift registers? Does it unload VIs from memory (when you would normally expect them to remain in memory)? Or is it just "tidying things up" more quickly than it normally does?

We're using LV 6.0.2 with TS 2.0.1 in a complex test system, and we're trying to fix an apparent memory leak.

Thanks,

Darin
Message 4 of 8
From what I've been able to determine, "Deallocate memory as soon as possible" simply deallocates memory that was used in subVIs: once the subVI exits, the memory is freed. The exact mechanics are a bit mysterious; for instance, what happens if the subVI is in a loop? Is the system smart enough to wait until the last iteration before freeing memory?

I work with very large data sets, and typically work with the deallocation turned on. It's a bit painful at first, but better than dropping several hundred megabytes onto the swap disk.
Message 5 of 8
"Darin" wrote:
> Does anyone know exactly what the "Deallocate memory as soon as
> possible" option does? We are also trying this as a last resort, but
> we have no clue what it is actually doing. For example, will it dump
> data in uninitialized shift registers? Does it unload VIs from memory
> (when you would normally expect them to remain in memory)? Or is it
> just "tidying things up" more quickly than it normally does?
>
> We're using LV 6.0.2 with TS 2.0.1 in a complex test system, and we're
> trying to fix an apparent memory leak.

One hard thing about trying to decide whether you have a memory leak is
that LV seems to cache the subVI code somehow. The reference posted is a
good read. I'm still not positive what happens with that memory, but Perfmon
always lists LV as using lots of memory even when a simple dataflow program
is done executing. The explanation I was given is that the subVIs are cached
so they execute faster in the future. That makes leaks harder to trace, because
from the outside LV looks large.

One possibility: try the profiler and see if you can spot where the
memory is going.

-joey
Message 7 of 8
> Does anyone know exactly what the "Deallocate memory as soon as
> possible" option does? We are also trying this as a last resort, but
> we have no clue what it is actually doing. For example, will it dump
> data in uninitialized shift registers? Does it unload VIs from memory
> (when you would normally expect them to remain in memory)? Or is it
> just "tidying things up" more quickly than it normally does?
>


This option means that after each subVI call, even those inside
loops, LV will deallocate all temporary memory buffers. It has no
effect on unwired shift registers, on controls and indicators, or on the
results of the subVI. Without this option turned on, LV leaves the
buffers allocated, waiting for the subVI to be called again. So this is
the typical space-versus-time tradeoff. It has no effect on which VIs
are loaded or unloaded.
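LabVIEW code is graphical, so a text sketch can only be an analogy, but the space-versus-time tradeoff Greg describes looks roughly like this in Python (the function names and the scale factor of 2.0 are invented for illustration):

```python
# A subVI's data space behaves like this module-level buffer: by default,
# LabVIEW keeps it allocated between calls so the next call is fast.
# "Deallocate memory as soon as possible" behaves like the second
# function: the buffer is released after every call, lowering
# steady-state memory at the cost of allocator churn.

_buffer = None  # persists between calls, like a subVI's data space


def scale_keep_buffer(samples):
    """Default behavior: reuse one allocation across calls."""
    global _buffer
    if _buffer is None or len(_buffer) != len(samples):
        _buffer = [0.0] * len(samples)   # allocate only when the size changes
    for i, s in enumerate(samples):
        _buffer[i] = s * 2.0
    return sum(_buffer)


def scale_deallocate(samples):
    """'Deallocate as soon as possible': fresh allocation every call."""
    buf = [s * 2.0 for s in samples]     # allocated here...
    return sum(buf)                      # ...and freed once the call returns
```

Both produce the same result; only when and for how long the intermediate buffer lives differs.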

Now that you know what it does: this feature can lead to fragmentation
in the memory manager, since it so frequently allocates and deallocates
blocks of memory. There is nothing wrong with the feature itself, but most
memory managers don't cope well with this pattern, and with certain VIs it
can result in more memory being used than when it is turned off.

Often the "memory problem" is better solved by getting rid of global
variables and local variables and making more subVIs. It is hard to get
more specific, but the user manual used to have a chapter on performance
that explained some of it.

Greg McKaskle
Message 8 of 8
> I have a high speed hardware timed DAQ program (Labview 6.02) that
> gathers syncronous counter/timer information for a short period of
> time. Due to the shear volume of data and the speed at which it is
> gathered It is buffered and then saved to disk. Upon completion of the
> acqiusition the data is pulled from the disk and processed (x-Y
> graphs, FFT etc). If I try and do another acquisition the program
> doesnt seem to re-use the memory utilised for data and for displaying
> it in graphs etc. So after a few acquisitions the whole thing can
> grind to a halt. I have to close the application and Labview and
> restart it so that the acquisition will run properly and then the
> things starts all over again. How can I get labview to sort of dump
> the data or clear its memory once I have processed the graphs etc.
> After all the raw data is saved on disk anyway.
>

If you have shift registers or globals that are accumulating all of the
data, you will want to initialize them periodically to free it. For
charts, there is a property called History: writing a smaller array,
such as an empty array, to it clears the chart.
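Since LabVIEW diagrams can't be pasted as text, here is the same idea as a Python sketch (the function name is invented): an uninitialized shift register behaves like a module-level accumulator that grows across runs until you explicitly write an empty array into it, which is exactly the reset suggested above.

```python
_history = []  # like an uninitialized shift register: survives between runs


def acquire(new_samples, reset=False):
    """Append a block of samples; pass reset=True at the start of a run,
    like wiring an empty array into the shift register, so the old data
    is freed instead of accumulating forever."""
    global _history
    if reset:
        _history = []          # the old list becomes garbage and is freed
    _history.extend(new_samples)
    return len(_history)
```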

Finally, you might want to open the profiler, found under
Tools>>Advanced in LV6, turn on the memory profiling and run the app to
see if you can tell where the data is accumulating.

Greg McKaskle
Message 3 of 8
A moderately good reference for how LabVIEW handles memory is "LabVIEW Performance and Memory Management", also available by searching for Application Note 168 on the NI web site. It's not going to keep anyone awake at night, but it does have good information on optimizing LabVIEW programs.
Message 6 of 8