06-17-2021 01:51 AM
I have a project which requires data logging to a flash drive, but it seems to cause a real-time memory leak (or let me know if there is a better term for this issue).
Attached is a version of the data logging portion of my project that represents what I am trying to do: take a cluster of data and write it to a file every so often. (I used a For Loop with a constant to simplify the code, but in my actual project I use a Queue to buffer data and write to a text file every 10 seconds or so.)
Now if you were to run the code and keep an eye on "FreePhysMem", you'll notice it decreases rather quickly. I need this system to run for at least 24 hours.
If you enable all four Diagram Disable structures (which close the file each loop), the memory loss is significantly reduced. It still loses memory over time, which I am not happy with, but I'll tackle one issue at a time.
06-17-2021 02:59 AM - edited 06-17-2021 03:00 AM
06-17-2021 01:46 PM
Made the changes with no improvement (except code readability, which I appreciate).
For reference, "FreePhysMem" starts at about 135,000 and steadily drops to about 75,000 after 2 minutes. I updated the default values to demonstrate the memory effect.
New snippet and project zip attached.
06-17-2021 04:16 PM
I have narrowed the culprit down to the 'Write to Text File' VI.
I added a Case Structure around it: when I enable 'Write to Text File' the memory drops, and when I disable it the memory doesn't budge.
06-17-2021 04:28 PM
Try putting a "Flush File" after the "Write to Text File" and see if that helps.
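For readers outside LabVIEW: in text-file terms, the suggestion amounts to pushing buffered bytes out toward the drive after each write. A rough Python sketch of that idea (the helper name is hypothetical; this is a conceptual equivalent, not the actual VI):

```python
import os
import tempfile

def write_and_flush(f, text):
    """Write text, then push it through the buffers toward the drive."""
    f.write(text)
    f.flush()             # flush the process-level buffer to the OS
    os.fsync(f.fileno())  # ask the OS to commit the data to the device

# Usage: append a line and force it to disk immediately.
path = os.path.join(tempfile.mkdtemp(), "log.txt")
with open(path, "a") as f:
    write_and_flush(f, "sample,0\n")
```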
06-17-2021 05:09 PM
No luck with Flush File.
Memory drops just as fast as before.
I tried using "Write Delimited Spreadsheet", then realized it's just a prebaked VI built from the same basic File I/O components, except this one opens and closes the file every loop. I thought that was bad practice, but that was before I learned to buffer data and write a big chunk all at once instead of one sample at a time.
Maybe the fact that I am working in a Linux-based real-time environment also plays a role in this issue, or at least makes it more prominent.
I'll mark this as resolved with the takeaway that I should buffer data, write to file every so often, and open/close the reference every time I write the buffered data to file.
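That takeaway, sketched in Python for readers outside LabVIEW (the queue-and-buffer pattern is from my project; the function name and sample data are illustrative): samples accumulate in a queue, and on each flush the consumer drains the queue and appends one big chunk, opening and closing the file around each write.

```python
import os
import queue
import tempfile

def flush_buffer_to_file(q, path):
    """Drain all buffered samples and append them in a single write."""
    lines = []
    while True:
        try:
            lines.append(q.get_nowait())
        except queue.Empty:
            break
    if lines:
        # Open, append the whole chunk, close -- no reference held between writes.
        with open(path, "a") as f:
            f.write("\n".join(lines) + "\n")
    return len(lines)

# Usage: buffer a few samples, then write them all in one chunk.
q = queue.Queue()
for i in range(5):
    q.put(f"sample,{i}")
path = os.path.join(tempfile.mkdtemp(), "log.txt")
written = flush_buffer_to_file(q, path)
```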
Final snippet and project zip attached for completeness.
06-17-2021 07:21 PM
I spoke too soon.
Once I select "Append to file" the memory starts dropping again. And appending to file is definitely needed.
It seems the main limitation is file size and memory size on a real-time target. The whole file has to be loaded into memory in order to append and write new data. What doesn't make sense is why the memory isn't released once the file is closed.
So even if I were to limit each file size and create a new file at some size limit, I don't think it would solve the memory problem because the old files are never released from memory.
I feel I am back at square one, with more knowledge and fewer ideas.
06-17-2021 09:33 PM
@cdrps wrote:
It seems the main limitation is file size and memory size on a real-time target. The whole file has to be loaded into memory in order to append and write new data. What doesn't make sense is why the memory isn't released once the file is closed.
So even if I were to limit each file size and create a new file at some size limit, I don't think it would solve the memory problem because the old files are never released from memory.
I feel like this isn't true... I'm running code that looks like the following on a cRIO-9045 (LabVIEW 2019), and it will happily run without losing memory (at least, none that I've noticed).
This is a bit more "stuff" than is really necessary for this discussion, but you could completely ignore the bottom shift register, and then you're left with something similar to what you seem to have:
I don't know if the myRIO is somehow unusually bad, but I'd expect that writing to a file should not keep the data in memory after it is written and flushed.
06-18-2021 03:46 PM
Hopefully I am not speaking too soon again but I found a solution.
I was using LabVIEW 2016 (on both the myRIO and the computer), and after upgrading to 2019 the problem went away.
This problem occurred on more than one myRIO with LV2016 installed, so I know it isn't specific to a single unit.
This kind of solution isn't very satisfying, but at least now it works.
03-09-2022 01:31 PM
Fun Times.
I just did the exact same thing you did, using an sbRIO-9637. So much time wasted because I assumed the problem was caused by something I had coded.
Unfortunately, I don't have the option of upgrading to LabVIEW 2019; I'm using 2018 SP1 at the moment.
It appears that I can run for about 20 hours before I start running out of memory. I might try putting an auto-reboot in the code, which isn't much of a solution if you expect your system to collect data and run continuously for years.
Relec