The problem I'm facing is that after running my sequence for a couple of days, Windows starts running low on memory and eventually an error occurs. In the Resource Monitor I can see that 95+% of the memory is reserved, and it frees up if I click "Unload All Modules" in the File menu. I believe this pretty much confirms that I'm facing memory leaks, so I assumed I could unload my LabVIEW modules programmatically by executing the following command in TS:
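(The original post's snippet was lost; based on the method named later in the thread, the call was presumably the TestStand Engine's UnloadAllModules method, which in a sequence expression would look something like this — the RunState.Engine prefix is an assumption about how it was invoked:)

```
// Hypothetical reconstruction of the statement step's expression:
RunState.Engine.UnloadAllModules()
```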
I have a framework sequence which calls subsequences as "New Execution". For example, I have a subsequence which starts displaying a continuously updating graph. As an experiment, I started my framework and executed the command above after calling the graphing subsequence. I expected the graphing window to either disappear or at least stop updating, but nothing happened.
So can you explain what this UnloadAllModules() command does and what its scope is, exactly?
Thanks for posting on the forums. I found a help article on our website that may help with your issue.
Troubleshooting Memory Growth Issues in TestStand Systems
I hope this helps!
Thanks Patrick. I have checked this, and while it can be useful, right now it seems too broad and lacks specifics for my case.
One thing I have noticed about the memory growth I'm experiencing is that I don't really have a problem with the "in use" memory; it's the "modified" memory that keeps growing. I find this interesting because if I write a VI like in >>>this<<< video to cause a memory leak, it's the "in use" memory that grows. Once the VI finishes executing, LabVIEW sweeps the data out of memory, so it's no longer a problem.
This is not my case. Once the "modified" memory has grown, I can't sweep data out of it at all. It doesn't matter if I close LabVIEW and TestStand; that only changes the "in use" memory consumption, not the "modified". This means that once my memory starts to fill up, the only option I have found so far is to reboot the computer.
My application doesn't deal with large datasets. I use some semaphores and queues here and there, but absolutely nothing extensive! No huge files are written to the HDD. The PC has 8 GB of memory and the pagefile is managed by Windows. Previously it was set to 16 GB+, and I thought maybe letting Windows manage it would solve the problem. (I was too optimistic. :)
I use the "Unload after sequence executes" option, which I hoped would help with the problem, but it doesn't.
Do you close all references created in code modules?
Any references passed into code modules you don't need to close, as TestStand will handle those; but any references created within code modules you need to close before returning to TestStand. Similarly, any references you create (in code modules) within loops need to be closed before the next iteration.
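To make the rule above concrete, here is a language-agnostic sketch in Python (purely illustrative, since LabVIEW code is graphical — the `Resource` class stands in for any LabVIEW reference such as a queue, semaphore, or VISA session):

```python
class Resource:
    """Stand-in for a LabVIEW reference (queue, semaphore, VISA session, ...)."""
    open_count = 0  # number of references currently open

    def __init__(self):
        Resource.open_count += 1
        self.closed = False

    def close(self):
        if not self.closed:
            self.closed = True
            Resource.open_count -= 1

def leaky_step(iterations):
    # Anti-pattern: a new reference is created every iteration
    # and never closed, so references accumulate.
    for _ in range(iterations):
        ref = Resource()
        # ... use ref ...

def clean_step(iterations):
    # Correct pattern: every reference created inside the loop is
    # closed before the next iteration begins.
    for _ in range(iterations):
        ref = Resource()
        try:
            pass  # ... use ref ...
        finally:
            ref.close()

leaky_step(1000)
print(Resource.open_count)   # 1000 -> references pile up, memory grows
Resource.open_count = 0
clean_step(1000)
print(Resource.open_count)   # 0 -> nothing accumulates across iterations
```

The same accounting applies per call in TestStand: a code module that opens a reference on every invocation and never closes it will grow memory for as long as the sequence runs.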
Alright... So as far as I was concerned, I had closed each and every reference I had ever opened. And then I realized that I am periodically calling a VI (quite often, actually: once every 15 s) which talks to another device via RS-232. I never explicitly opened a serial reference because the default serial settings simply worked; I just started talking to the device and it worked fine. Because I never explicitly opened the reference, I never really thought about explicitly closing it.
This was the mistake, I assume. Once I started closing the serial reference, the memory leak seems to be gone completely! Yay!
Thanks to everyone who contributed to the solution.