10-20-2016 10:26 AM - last edited on 12-19-2024 09:21 AM by Content Cleaner
We have an application where data is received over TCP, deserialized, and fed into a mixed signal graph (8 analog channels and 2 digital buses of 4 bits each) periodically, about every 500 ms. The analog and digital data are combined into a cluster and fed directly into the graph's terminal, though the terminal sits in a case structure controlled by a 'pause' button. This is built into an executable whose memory usage grows slowly but steadily (about 4 MB per hour) - unless the pause button is on. So the culprit must be the graph, no?
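For scale, that growth rate works out to only a few hundred bytes per graph update; a quick back-of-the-envelope calculation (Python, purely illustrative since the app itself is LabVIEW):

    # Rough size of the leak per graph update.
    # Assumes: 4 MB/hour growth, one update every 500 ms.
    leak_per_hour = 4 * 1024 * 1024          # bytes per hour
    updates_per_hour = 3600 / 0.5            # 7200 updates per hour
    print(leak_per_hour / updates_per_hour)  # ~582 bytes per update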
I made a VI to reproduce it so I could post it here: I copied the source VI, removed just the TCP part and replaced it with dummy randomized data. The copy's memory usage also increases, but it does so in larger steps every 15 minutes or so; no idea why that is. I'm looking at the memory in perfmon, the 'Private Bytes' counter, if that matters; see the attached screenshot: blue is the copy, red is the original, and halfway through the graph both of them were paused.
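For anyone who wants to log the same counter without perfmon, here is a minimal Python sketch (it assumes the psutil package; the executable name is a placeholder, not the real one):

    # Log a process's Private Bytes once a minute (assumes psutil).
    import time
    import psutil

    # 'MyApp.exe' is a placeholder for the built executable's name.
    target = next(p for p in psutil.process_iter(['name'])
                  if p.info['name'] == 'MyApp.exe')
    while True:
        info = target.memory_info()
        # On Windows, memory_info() exposes 'private' (= Private Bytes);
        # fall back to RSS on other platforms.
        print(time.strftime('%H:%M:%S'), getattr(info, 'private', info.rss))
        time.sleep(60)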
Looking around I saw this, but without more information it's hard to tell whether it's related.
Is there anything I could do to narrow down the problem? Does anyone know a reason why it could leak? We currently have LabVIEW 2013; maybe someone can try it in a later version to see whether the issue is still there.
Thanks in advance!
10-21-2016 04:01 AM - edited 10-21-2016 04:06 AM
Hi,
I'm Nick, an Applications Engineer at NI.
I have had a look at your VI and, by the looks of it, the memory leak is caused by references not being closed if you are using a Value property node inside the loop. Try closing all the references (use Close Reference) and check whether the problem still occurs.
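For anyone unfamiliar with that pattern, the equivalent bug in a text language would be acquiring a new resource on every iteration without releasing it; a minimal Python illustration (not LabVIEW, just the shape of the leak):

    # Illustration only: the reference-leak pattern in text form.
    import os
    import tempfile

    fd, path = tempfile.mkstemp()  # scratch file standing in for a reference target
    os.close(fd)

    # Leaky: a new OS handle is acquired on every iteration and never
    # released, so handle count and memory grow with the loop.
    leaked = [open(path, 'rb') for _ in range(100)]

    # Fix: release each reference when done (the LabVIEW equivalent is
    # wiring it into Close Reference).
    for f in leaked:
        f.close()
    os.remove(path)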
EDIT: I was able to check the database for known issues, and it looks like writing an analog and a digital waveform in a cluster to the same graph causes memory leaks, and there is no known way to fix it (apart from displaying two graphs).
10-24-2016 02:41 AM
Hi Nick,
thanks for the reply. Regarding references: there's only one front panel control in the loop; if I closed that, it wouldn't be available in the next iteration anymore, right?
Do you think the memory leak is going to be fixed in a future version?
Thanks!
10-24-2016 03:15 AM
Hi Stijn,
I can tell you that R&D are aware of the issue and that it is in their work queue, but I can't give any timeline for fixing this problem, as I do not know what the priorities for improvements are.
02-25-2017 11:39 AM - edited 02-25-2017 11:54 AM
I have stumbled upon this same memory leak in the mixed signal graph and have prepared a minimal design that reproduces the problem. Apparently the Build Array makes the difference.
When displaying an array of digital waveforms, the leak is there; when displaying just a single digital waveform, there is no leak. In addition, the problem does not occur when displaying analog waveforms, whether in arrays or not.
Demo is attached.
As far as I can tell there is no obvious workaround, because if you have a mixed signal set you _have_ to combine the digital and analogue parts using a Build Array function - and that provokes the memory leak. And by definition you have mixed analogue and digital data, otherwise there would be no point in using this graph in the first place! So basically, the mixed signal graph does not work.
Any workaround ideas?
02-26-2017 02:26 AM
See the response from Mikolaj above:
I was able to check the database for known issues, and it looks like writing an analog and a digital waveform in a cluster to the same graph causes memory leaks, and there is no known way to fix it (apart from displaying two graphs).
Which I interpret as: the cause of the memory leak isn't really the building of the array as you assume; it's the graph leaking some or all of the built array's memory. And there's no workaround except not using the mixed signal graph.
02-26-2017 10:08 AM
I noticed that the VI profile tool is not able to detect the memory leak.
Is that intended behaviour or a bug?
In any case it limits the usefulness of the Profile tool.