LabVIEW


Memory Leak?

I have a VI which consists of a loop that basically sets up a buffer,
sets up an analog trigger, then scans in 120 ms of data when the trigger
occurs. I read the data out of memory and display it on a graph, use
AI Clear to clear the acquisition, then perform the loop all over again.
After a few days, Windows NT starts showing memory errors, and when I
check what is hogging the memory, it's LabVIEW. If I shut down LabVIEW
and start over, it starts up at about 17 MB of memory, then steadily
increases over a couple of days to 110 MB until I get warnings about low
virtual memory, etc.

Any ideas what could be causing the problem? (The
"Edit->Preferences->Performance and Disk->Deallocate memory as soon as
possible" is checked, and I am running with multiple threads)

Thanks,

Mark


Sent via Deja.com http://www.deja.com/
Before you buy.
Message 1 of 14
wrote in message
news:8nh5m0$7hc$1@nnrp1.deja.com...
> [original problem description snipped]

You're definitely not losing memory yourself through arrays? If you check
the memory usage of the VI using "VI Info" it should show how much memory is
allocated to the VI. You seem to be suggesting that you have to shut down
Labview and allow the system- rather than Labview- to purge the memory, in
which case the "VI Info" box should show nothing out of the ordinary. If the
box DOES show a huge memory allocation, then it indicates Labview knows the
memory is allocated and who owns it, and the problem is in your code rather
than in Labview's guts. Similarly, selectively closing VIs will cause their
memory to be freed.

File handles etc can also consume memory, and seem to only be freed when all
VIs stop executing- not simply the VI that opened the handle. I had a
problem once with this when I was opening a TCP connection every few seconds
and forgetting to close it- symptoms over a few days are similar to what you
describe. What do you do with the data after displaying it?
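Craig's file-handle point can be sketched generically in Python (plain Python, not LabVIEW's TCP VIs; the `Connection` class here is a made-up stand-in). A resource opened each iteration and never closed accumulates until everything stops, while closing per iteration keeps the count flat:

```python
# Minimal sketch of the leak pattern Craig describes: a resource
# opened every iteration and never closed stays allocated until the
# whole program stops.
open_handles = []

class Connection:
    """Hypothetical stand-in for any handle-like resource."""
    def __init__(self):
        open_handles.append(self)
    def close(self):
        open_handles.remove(self)

# Leaky version: open every few seconds, forget to close.
for _ in range(5):
    conn = Connection()
    # ...use conn...

leaked = len(open_handles)       # all 5 handles still live

# Fixed version: close at the end of each iteration.
open_handles.clear()
for _ in range(5):
    conn = Connection()
    # ...use conn...
    conn.close()

remaining = len(open_handles)    # nothing left open
```

The same shape applies to file handles, DAQ task handles, or anything else that must be explicitly released.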
Message 2 of 14
Craig Graham wrote:

> [original problem description and Craig's reply snipped]

I've got a similar problem (albeit much, much worse) with a large
project that does extensive vi-server, queue and notifier
handling. After several hours of usage it's eaten up most of the
available memory. The delay comes when the program is stopped -
LabVIEW just sits at 100% processor and takes up to 45 minutes to
recover. Interestingly enough, Win9x doesn't have this problem and will
stop quickly.

Unfortunately I've not been able to replicate the problem in a less
hideously complex piece of code.
--
******************************************************************************
Gavin Burnell Dept. Materials Science and Metallurgy
Device Materials Group University of Cambridge, UK
******************************************************************************
Message 3 of 14
In article <8njcc3$lf3$1@pegasus.csx.cam.ac.uk>, gb119@cus.cam.ac.uk (Gavin
Burnell) wrote:

> I've got a similar problem (albeit much much worse) with a large
> project that does extensive vi-server, queue and notifier
> handling. After several hours of the usage it's eaten up most of the
> available memory. The delay comes when the program is stopped -
> LabVIEW just sits at 100% processor and takes up to 45 minutes to
> recover. Interestingly enough Win9x doesn't have this problem and will
> stop quickly.


I suspect that you have a VI Reference leak in your code.

If you open a VI Ref (using Open VI Reference) and do not release it (with
Close VI Reference) LabVIEW will clean up and release it for you when your
program stops. In the process of releasing a VI Ref, LabVIEW must check all
other references to the same VI to see whether there are still any reasons
for keeping around various resources that the VI may have allocated. This
is, unfortunately, an O(n^2) algorithm. (The n^2 part of the cleanup was new
to LV 5.1 and remains in LV 6i.)

If you are opening VI Refs in a loop and failing to close them with each
iteration, you are allocating a new VI Ref each time through the loop,
making n a very large number, which in turn makes n^2 a nastily large
number.

Just close each reference at the end of each iteration of the loop.

Or, better yet, open the reference once outside the loop, use the same
reference during each iteration, and close it outside once the loop has
completed. This not only avoids large n for the O(n^2) cleanup, but avoids
the overhead of multiple Open VI Reference calls, which, if it must load the
VI from disk, can be significant.
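Rob's advice, as a minimal Python sketch of the pattern (Python standing in for a LabVIEW diagram; `open_vi_reference` and `close_vi_reference` are invented stand-ins that just count live references, not a real API):

```python
# Counters standing in for LabVIEW's bookkeeping: how many VI Refs
# are live right now, and the worst case seen so far.
live = 0
peak = 0

def open_vi_reference(path):
    """Stand-in for Open VI Reference: allocates one more ref."""
    global live, peak
    live += 1
    peak = max(peak, live)
    return object()

def close_vi_reference(ref):
    """Stand-in for Close VI Reference: releases one ref."""
    global live
    live -= 1

# Leaky pattern: opening in a loop without closing makes n huge,
# and cleanup at program stop is O(n^2) per Rob's post.
for _ in range(1000):
    open_vi_reference("worker.vi")
leak_peak = peak                 # 1000 refs live at once

# Rob's better fix: open once outside the loop, reuse, close once.
live = peak = 0
ref = open_vi_reference("worker.vi")
for _ in range(1000):
    pass                         # ...call the VI by reference here...
close_vi_reference(ref)
reuse_peak = peak                # never more than 1 ref live
```

With 1000 leaked references, the O(n^2) cleanup Rob describes is on the order of a million checks at stop time; with the open-once pattern, n stays at 1 and cleanup is trivial.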

--
Rob Dye
LabVIEW developer
rob dot dye at ni dot com

Message 5 of 14
Rob Dye wrote:
> [Gavin's description of his similar problem snipped]


> I suspect that you have a VI Reference leak in your code.

Yes, this was the first place I looked 🙂

> [Rob's explanation of the O(n^2) VI Ref cleanup and the
> open-once-outside-the-loop fix snipped]

Actually I'm using a separate vi that runs in parallel with the main
vi and communicates via queues to do all the vi reference opening and
closing, so I can be sure that I'm only opening vi-references once
and closing them afterwards (this is because dynamically run vis
call other vis dynamically, and if I didn't move the vi ref handling to
a separate vi I would spend all my time loading and unloading
vis). Actually I see a large delay after I have closed all the open vi
references, when I would have thought that any vi-references left
floating around would all be invalid and so shouldn't be involved in the
garbage collection. Smaller test programs which use the same
architecture to manage the vi-refs don't have the problem, and neither
does the same program on a Win9x platform. Also, this problem has been
around since LV 5.0, so I don't think it is the n^2 cleanup, if I read
your post correctly.
--
******************************************************************************
Gavin Burnell Dept. Materials Science and Metallurgy
Device Materials Group University of Cambridge, UK
******************************************************************************
Message 12 of 14
Forgot to answer this in my last posting! Sorry about that. I am 99%
sure I am not losing it in arrays. I am not using any shift registers
to contain data, or have any arrays increasing in size over time.
Basically, I have the array of data which comes from my AI read, and I
display that on a graph. That's it. That was the first idea that I
had, and I am fairly certain that the arrays aren't the problem. I've
even watched their size (with indicators) to make sure they weren't
increasing in size during loop iterations.

Mark

In article <8nhccc$ebs$1@sponge.lancs.ac.uk>,
"Craig Graham" wrote:
> [Craig's reply about arrays and file handles snipped]


Message 7 of 14
This is a long shot, but may be something to look into if nothing simpler
turns up.

I've just had a quick nosey through the AI read sub-vi, and I notice that
rather than passing a predefined array to the CIN to hold the data, a new
array is created on the fly in each call. In principle the memory for this
should be deallocated automatically, but if you hit a dead end it may be
worth hacking those functions so you can initialise a buffer outside your
loop, hold that in a shift register, and pass it down to the AI VIs for them
to load the new data into.

I've never seen these go wrong in this way, though..
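Craig's buffer-reuse idea, sketched in Python under the same caveat (the functions are hypothetical stand-ins, not the actual AI CIN interface): allocate one buffer outside the loop, hold on to it, and refill it in place each iteration instead of creating a fresh array per call.

```python
def read_fresh(n):
    """Per-call allocation, as Craig reads the AI read sub-vi:
    a brand-new array on every call."""
    return [0.0] * n

def read_into(buf):
    """Preallocated alternative: overwrite the existing buffer in
    place, so no new allocation happens per iteration.  In LabVIEW
    terms the buffer would live in a shift register."""
    for i in range(len(buf)):
        buf[i] = 0.0
    return buf

# "Initialise a buffer outside your loop" and reuse it each pass.
buf = [1.0] * 1200
for _ in range(3):
    data = read_into(buf)       # same object reused every iteration

same_object = data is buf       # no per-iteration array was created
```

This is the standard cure for allocation churn in a tight acquisition loop, independent of whether the original leak is here or elsewhere.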

wrote in message
news:8nrdd3$d1c$1@nnrp1.deja.com...
> [Mark's reply about checking his arrays snipped]
Message 8 of 14
"Craig Graham" writes:

> This is a longshot, but may be something to look into if nothing simpler
> turns up.
>
> I've just had a quick nosey through the AI read sub-vi, and I notice that
> rather than passing a predefined array to the CIN to hold the data, a new
> array is created on the fly in each call. In principle the memory for this
> should be deallocated automatically, but if you hit a dead end it may be
> worth hacking those functions so you can initialise a buffer outside your
> loop, hold that in a shift register, and pass it down to the AI VIs for them
> to load the new data into.
>
> I've never seen these go wrong in this way, though..

Hi,

This problem doesn't have to be related to LabVIEW. All Microsoft
OSes I know of (Win9x, NT4) lack proper coalescing of freed memory
into big chunks, so the heap fragments and you get a new memory block
every time. Some big names in the industry recommend rebooting an NT
box once a week...

I'd recommend a Unix-like OS (Linux). The high-level DAQ VIs work
with the tricks posted.

Johannes Niess
Message 9 of 14
The spontaneous migration of an established system- that often has
dependencies that stretch beyond Labview itself- to a different platform is
not generally the first option when one encounters a bug. 🙂

Johannes Niess wrote in message
news:m2ya1qmu9s.fsf@server.landtechnik.uni-bonn.de...

> [Johannes' suggestion to switch OS snipped]
Message 10 of 14
Mark,
Your problem is with the repeated calls to the AI VIs. Repeatedly
creating and clearing the AI buffer tends to consume memory for some
reason. If you're acquiring the same number of scans for each of your AI
operations, you can re-use the buffer by moving the AI Config and AI
Clear VIs out of your loop.

Hope this helps..
-Todd
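Todd's restructuring, as a hypothetical Python sketch (`ai_config`, `ai_start`, `ai_read` and `ai_clear` are invented stand-ins for the AI Config, AI Start, AI Read and AI Clear VIs, since LabVIEW diagrams can't be shown in text): configure once before the loop, clear once after it, and only start/read inside.

```python
configs = 0

def ai_config(scans):
    """Stand-in for AI Config: each call creates a new AI buffer."""
    global configs
    configs += 1
    return [0.0] * scans

def ai_start(buf):
    """Stand-in for AI Start: arm the analog trigger."""
    pass

def ai_read(buf):
    """Stand-in for AI Read: return the triggered scans."""
    return list(buf)

def ai_clear(buf):
    """Stand-in for AI Clear: release the acquisition."""
    pass

# Todd's layout: AI Config before the loop, AI Clear after it, so
# the buffer is created exactly once however long the loop runs.
buf = ai_config(scans=1200)
for _ in range(100):
    ai_start(buf)
    data = ai_read(buf)
    # ...display data on a graph...
ai_clear(buf)

total_configs = configs          # one configuration, not one per pass
```

This only works when every acquisition uses the same scan count, which is exactly the condition Todd states.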


In article <8nh5m0$7hc$1@nnrp1.deja.com>, markwysong@my-deja.com says...
> [original problem description snipped]


Message 4 of 14