

Mixed Signal Graph memory leak


We've got an application where data is received over TCP, deserialized, and fed into a mixed signal graph (8 analog channels, 2 digital buses of 4 bits each) periodically, about every 500 ms. The analog and digital data are combined into a cluster and wired directly to the graph's terminal, though the terminal sits in a case structure with a 'pause' button. This is built into an executable whose memory usage increases slowly but steadily (about 4 MB per hour) - unless the pause button is on. So the culprit must be the graph, no?
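Since a LabVIEW VI can't be shown inline as text, here's a rough Python sketch of the shape of one update as described above: 8 analog traces plus 2 digital buses of 4 bits each. The function name, sample count, and value ranges are my own illustration, not taken from the actual VI.

```python
import random

def make_update(n_analog=8, n_buses=2, bus_width=4, samples=100):
    """Build one randomized update in the shape described in the post:
    8 analog traces plus 2 digital buses of 4 bits each.
    (Names, sample count, and value ranges are illustrative only.)"""
    # 8 analog traces, each a list of float samples
    analog = [[random.random() for _ in range(samples)]
              for _ in range(n_analog)]
    # 2 digital buses: each bus is `samples` states of `bus_width` bits
    digital = [[[random.randint(0, 1) for _ in range(bus_width)]
                for _ in range(samples)]
               for _ in range(n_buses)]
    return {"analog": analog, "digital": digital}

update = make_update()
print(len(update["analog"]), len(update["digital"]))  # 8 2
```

In the VI this whole payload would be bundled into one cluster and written to the graph terminal every 500 ms.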

 

I made a VI to reproduce it to post here: I copied the source VI, removed just the TCP part, and replaced it with dummy randomized data. The copy's memory usage also increases, but it does so in larger steps every 15 minutes or so; no idea why that is. I'm looking at the memory in perfmon, specifically the 'Private Bytes' counter, if that matters - see the attached screenshot: blue is the copy, red is the original, and halfway along the graph both of them were paused.
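To put a number on a leak like this, one option is to export the perfmon Private Bytes samples and fit a least-squares slope to them; a steady positive slope over a long window distinguishes a true leak from normal allocation noise. A minimal stdlib-only sketch (the function name and synthetic data are my own, not from the thread):

```python
def leak_rate_mb_per_hour(samples):
    """Least-squares slope of memory readings, in MB/hour.
    `samples` is a list of (seconds_since_start, bytes) pairs,
    e.g. exported from perfmon's Private Bytes counter."""
    n = len(samples)
    mean_t = sum(t for t, _ in samples) / n
    mean_b = sum(b for _, b in samples) / n
    num = sum((t - mean_t) * (b - mean_b) for t, b in samples)
    den = sum((t - mean_t) ** 2 for t, _ in samples)
    bytes_per_sec = num / den
    return bytes_per_sec * 3600 / (1024 * 1024)

# Synthetic readings growing at ~4 MB/hour (the rate reported above),
# sampled once a minute for an hour:
data = [(t, 50_000_000 + int(t * 4 * 1024 * 1024 / 3600))
        for t in range(0, 3600, 60)]
print(round(leak_rate_mb_per_hour(data), 2))  # ~4.0
```

The same fit applied separately to the "paused" and "running" portions of the trace would confirm that the growth stops when the graph terminal stops being written.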

 

Looking around I saw this, but due to the lack of further information it's hard to tell whether it's related.

 

Is there anything I could do to narrow down the problem? Does anyone know a reason why it could leak? Currently we have LabVIEW 2013; maybe someone can try it in a later version to see if the issue is still there.

 

Thanks in advance!

Message 1 of 7

Hi, 

I'm Nick, an Applications Engineer at NI.

 

I have had a look at your VI, and by the looks of it the memory leak is caused by references not being closed. If you are using a Value property node inside the loop, try closing all the references (use the Close Reference VI) and check whether the problem still occurs.

 

EDIT: I was able to check the database for known issues, and it looks like writing an analog and digital waveform in a cluster to the same graph is causing memory leaks and there is no known way to fix it (apart from displaying two graphs).



----------------------------------------------------------------------------------------
Everything has an End, and you get to it only if you keep on
-E. Nesbit
Message 2 of 7

Hi Nick,

 

Thanks for the reply. Regarding references: there's only one front panel control in the loop, and if I closed that, it wouldn't be available in the next iteration any more, right?

 

Do you think the memory leak is going to be fixed in a future version?

 

Thanks!

Message 3 of 7
Solution
Accepted by topic author Stijn

Hi Stijn,

 

I can tell you that R&D are aware of the issue and it is in their work queue, but I can't give any timeline for fixing this problem, as I don't know what the priorities for improvements are.



Message 4 of 7

I have stumbled upon this same memory leak in the mixed signal graph and have prepared a minimal design which reproduces the problem. Apparently the Build Array makes the difference.

If displaying an array of digital waveforms, the leak is there; if displaying just a single digital waveform, there is no leak. The problem is also absent when displaying analog waveforms, whether in an array or not.

Demo is attached.

As far as I can tell there is no obvious workaround, because if you have a mixed signal set you _have_ to combine the digital and analog parts using a Build Array function - and that provokes the memory leak. And by definition you have mixed analog and digital data, otherwise there would be no point in using this graph in the first place! So basically, the mixed signal graph does not work.

Any workaround ideas?

Mixed signal graph memory leak - no leak.png

 

Mixed signal graph memory leak - with leak.png

Message 5 of 7

See the response from Mikolaj above:

 

I was able to check the database for known issues, and it looks like writing an analog and digital waveform in a cluster to the same graph is causing memory leaks and there is no known way to fix it (apart from displaying two graphs).

 

Which I interpret as: the cause of the memory leak isn't really the building of the array, as you assume; it's the graph leaking some or all of the built array's memory. And there's no workaround except not using the mixed signal graph.

Message 6 of 7

I noticed that the VI profile tool is not able to detect the memory leak.

Is that intended behavior or a bug?

In any case it limits the usability of the profile tool.

Message 7 of 7