

myRIO 1950 Write Data To File Memory Leak

Solved!

I have a project that requires logging data to a flash drive, but doing so seems to cause a memory leak on the real-time target (or let me know if there is a better term for this issue).

Attached is a version of the data-logging portion of my project to show what I am trying to do: take a cluster of data and write it to a file every so often. (I use a For Loop with a constant here to simplify the code; in my actual project I use a Queue to buffer data and write to a text file every 10 seconds or so.)
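Since LabVIEW snippets don't paste as text, here is a rough Python sketch of the pattern I mean (the names and timing are made up for illustration, this is not my actual code): a queue buffers the samples, and a logger drains it and appends the whole chunk every few seconds.

```python
import queue
import threading
import time

def buffered_logger(q, path, interval, stop):
    """Drain the queue every `interval` seconds and append the buffered
    lines to `path` in one chunk, closing the file between flushes."""
    while not (stop.is_set() and q.empty()):
        time.sleep(interval)
        lines = []
        while not q.empty():
            lines.append(q.get())
        if lines:
            # Open, append the whole chunk, and close in one step, so no
            # file handle is held open between flushes.
            with open(path, "a") as f:
                f.write("".join(lines))
```

In the real project a producer loop would `q.put()` each sample; the point is that the file is touched once per chunk, not once per sample.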

 

Now if you run the code and keep an eye on "FreePhysMem", you'll notice it decreases rather quickly. I need this system to run for at least 24 hours.

 

If you enable all four Diagram Disable structures (which close the file on each loop iteration), the memory loss is significantly reduced. Memory still drops over time, which I am not happy with, but I'll tackle one issue at a time.

 

snippet.png

Message 1 of 11

Hi cdrps,

 

can you test your system after applying some changes to your code?

See this:

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 11

Made the changes with no improvement (except code readability, which I appreciate).

For reference, "FreePhysMem" starts at about 135,000 and steadily drops to about 75,000 after 2 minutes. I updated the default values to demonstrate the memory effect. 

New snippet and project zip attached.

snippet2.png

Message 3 of 11

I have narrowed the culprit down to the 'Write to Text File' VI.

I added a Case Structure around it: with 'Write to Text File' enabled the memory drops, and with it disabled the memory doesn't budge.

cdrps_0-1623964580514.png

 

Message 4 of 11

Try putting a "Flush File" after the "Write to Text File" and see if that helps.

Message 5 of 11

No luck with Flush File.

Memory drops just as fast as before.

 

I tried using "Write Delimited Spreadsheet", then realized it's just a prebuilt VI made of the same basic File I/O components, except that it opens and closes the file on every call. I used to think that was bad practice, but that was before I learned to buffer data and write a big chunk all at once instead of one sample at a time.

 

Maybe the fact that I am working in a Linux-based real-time environment also plays a role in this issue, or at least makes it more prominent.

 

I'll mark this as resolved with the takeaway that I should buffer data, write to file every so often, and open/close the file reference each time I write the buffered data.

 

Final snippet and project zip attached for completeness.

 

snippet3.png

Message 6 of 11

I spoke too soon.

 

Once I select "Append to file" the memory starts dropping again. And appending to file is definitely needed.

 

It seems the main limitation is file size versus memory size on a real-time target: the whole file apparently has to be loaded into memory in order to append and write new data. What doesn't make sense is why that memory isn't released once the file is closed.
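For comparison, on a desktop OS append mode just positions the write cursor at end-of-file; the old contents are never read back. A quick Python illustration of that normal behaviour (desktop-only sketch, not myRIO code):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "append_demo.txt")

# Create a file with some existing contents.
with open(path, "w") as f:
    f.write("existing data\n")

# Append mode only positions the write cursor at end-of-file; the
# runtime does not need to read the old contents back into memory.
with open(path, "a") as f:
    f.write("new data\n")

with open(path) as f:
    contents = f.read()
```

Which makes what I am seeing on the myRIO all the more puzzling.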

 

So even if I were to limit each file size and create a new file at some size limit, I don't think it would solve the memory problem because the old files are never released from memory.
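For reference, the rotation idea would look something like this (a hypothetical Python sketch, not LabVIEW; names and limits are made up), though again it only helps if closed files actually get released:

```python
import os

def append_with_rotation(pattern, index, data, max_bytes):
    """Append `data` to the file `pattern.format(index)`, moving on to a
    fresh file once the current one reaches `max_bytes`. Returns the
    index of the file actually written to."""
    path = pattern.format(index)
    if os.path.exists(path) and os.path.getsize(path) >= max_bytes:
        index += 1                      # current file is full: rotate
        path = pattern.format(index)
    with open(path, "a") as f:          # open/append/close per write
        f.write(data)
    return index
```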

 

I feel I am back at square one, with more knowledge and fewer ideas.

Message 7 of 11

@cdrps wrote:

It seems the main limitation is file size and memory size on a realtime target. The whole file has to be loaded into memory in order to append and write new data. What doesn't make sense is why the memory isn't released once the file is closed.

So even if I were to limit each file size and create a new file at some size limit, I don't think it would solve the memory problem because the old files are never released from memory.


I feel like this isn't true... I'm running code that looks like the following on a cRIO-9045 (LabVIEW 2019), and it will happily run without losing memory (at least, none that I've noticed).

cbutcher_0-1623983178470.png

This is a bit more "stuff" than is really necessary for this discussion, but if you ignore the bottom shift register entirely, you're left with something similar to what you seem to have:

  • Open a file and "Initialize Session" before the loop (this happens once per cRIO restart for me, but the file-opening VI is called again near the end of the loop)
  • Use "FreePhysMem" (along with a bunch of other data sources) to get resource usage
  • Convert the values to a string for logging, then use "Write to Text File" to log them (the condition here is "logging is enabled and a file was opened"; logging can be turned off since it takes up physical disk space, and if I'm not checking for anything it just means I occasionally have to empty the cRIO disk so it doesn't get file errors from running out of hard-drive space)
  • Sometimes flush the file
  • Sometimes open a new file: Close File, then Clear Error (the reference might not exist if logging is not enabled; another Case Structure could probably avoid this, and I'm not sure why I chose Clear Error, hmmm...), then Create/Open/Replace File set to Create mode
  • After the loop finishes (on a basically global Stop button, usually not pressed), close the file
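As a text-only sketch of that structure (hypothetical Python, not the actual cRIO code; `read_metrics` stands in for the resource-usage reads, and the intervals are illustrative):

```python
def logging_loop(base, read_metrics, iterations,
                 flush_every=10, rotate_every=100):
    """Open a file before the loop, write one line per iteration, flush
    periodically, rotate to a fresh file periodically, and close the
    file when the loop stops. Returns the final file index."""
    index = 0
    f = open("{}_{}.log".format(base, index), "w")
    try:
        for i in range(iterations):
            f.write(read_metrics() + "\n")       # log one sample
            if (i + 1) % flush_every == 0:
                f.flush()                        # "sometimes flush the file"
            if (i + 1) % rotate_every == 0:
                f.close()                        # "sometimes open a new file"
                index += 1
                f = open("{}_{}.log".format(base, index), "w")
    finally:
        f.close()                                # close after the loop
    return index
```

The key point versus the snippets above is that the file handle lives across iterations and is only cycled at the rotation interval.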

 

I don't know if the myRIO is somehow especially bad here, but I'd expect that writing to a file should not keep the data in memory after it has been written and flushed.

 


Message 8 of 11
Solution
Accepted by cdrps

Hopefully I am not speaking too soon again but I found a solution.

I was using LabVIEW 2016 (on both the myRIO and the computer), and after upgrading to 2019 the problem went away.

This problem occurred on more than one myRIO with LV2016 installed, so I know it isn't specific to a single unit.

 

This kind of solution isn't very satisfying but at least now it works.

Message 9 of 11

Fun Times.  

I just did the exact same thing you did, using an sbRIO-9637. So much time wasted because I assumed the problem was caused by something I had coded.

Unfortunately I don't have the option of upgrading to LabVIEW 2019.  Using 2018 SP1 at the moment.  

It appears that I can run for about 20 hours before I start running out of memory. I might try putting an auto-reboot in the code, which isn't much of a solution if you expect your system to collect data and run continuously for years.

 

Relec

Message 10 of 11