DIAdem


TDMS write memory leak

Hi all,

I'm developing an application that streams data from a cRIO-9049 to a host. On the cRIO I do the acquisition and write the data to Network Streams. On the host side (a Windows 10 PC) I read the data from the streams and write it to TDMS files (a rough sketch of that host-side pattern is just after the list below).

- One stream carries the acquisition data: 24 channels at 102.4 kS/s each (about 160 Mb/s).

- The other stream carries statistics data (CPU load, RAM usage, etc.).
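I can't paste the G code here, so to make the structure clearer, here is a rough Python sketch of what the host side does, using the npTDMS package as a stand-in for the TDMS VIs. The block size, the bounded queue, the "acq"/"ch*" names, and stream.read() are just placeholders for the Network Stream read, not my actual code:

```python
import queue
from nptdms import TdmsWriter, ChannelObject

BLOCK = 10240                  # samples per channel per read (placeholder)
buf = queue.Queue(maxsize=64)  # bounded buffer, stands in for the Network Stream FIFO

def reader_thread(stream):
    """Drain the stream and push fixed-size blocks into the bounded buffer."""
    while True:
        data = stream.read(24 * BLOCK)      # placeholder for the Network Stream read
        buf.put(data.reshape(24, BLOCK))    # blocks here if the writer falls behind

def writer_loop(path):
    """Pop blocks from the buffer and append each one to the TDMS file."""
    with TdmsWriter(path) as tdms:
        while True:
            block = buf.get()
            channels = [ChannelObject("acq", f"ch{i}", block[i]) for i in range(24)]
            tdms.write_segment(channels)    # one TDMS segment per block
```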

 

When I launch my application, I see the memory used by the application on the host increase by about 100 MB after just 30 s, and when it reaches 2 GB the application crashes. I've read several topics on TDMS memory leaks and tried different things without success:

- I call TDMS Flush after each write.

- I close the file and start a new one every 1 GB (see the sketch after this list).

- I tried the G TDMS API...
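This is roughly what I mean by closing the file every 1 GB, again sketched in Python with npTDMS rather than the actual G code; the 1 GB threshold, the file naming, and the shape of the blocks are just examples:

```python
from itertools import count
from nptdms import TdmsWriter, ChannelObject

def write_with_rollover(blocks, base_path, limit=1_000_000_000):
    """Append blocks of samples to TDMS files, starting a new file after ~`limit` bytes."""
    blocks = iter(blocks)              # each block: channels x samples numpy array
    block = next(blocks, None)
    for index in count():
        if block is None:
            return
        written = 0
        # each file receives at most ~limit bytes before rolling over to the next one
        with TdmsWriter(f"{base_path}_{index:03d}.tdms") as writer:
            while block is not None and written < limit:
                channels = [ChannelObject("acq", f"ch{i}", row)
                            for i, row in enumerate(block)]
                writer.write_segment(channels)
                written += block.nbytes
                block = next(blocks, None)
```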

 

What's strange is that when I disable the data from one stream (statistics), there is no memory leak, but as soon as I write data from both streams (even into two different files) the problem comes back...

 

So, does anyone have an idea? Any help would be really appreciated!

  

 

Message 1 of 3

I am assuming you are writing your application in LabVIEW and not in DIAdem? If so, this will probably be a question for the LabVIEW forum.

Otherwise, I would suggest checking whether your reader/writer buffer sizes are limited, if you haven't tried that yet.

Reference:

https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z000000kHrdSAE&l=en-US

Message 2 of 3

Hi gsklyr!

Thanks for your answer... yep, I think I will write a new post in the LabVIEW forum.

Anyway, you're right: I thought it came from the TDMS write, but it seems to come from the Network Stream. I had already read that article and tried changing the buffer size.

It seems to come from using waveforms in the stream...

I will dig a little further, maybe try using an array of doubles instead...
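What I have in mind is sending only raw samples through the stream and writing the timing once as channel properties, roughly like this (a Python/npTDMS sketch of the concept only; read_blocks() and the names are placeholders I made up, while wf_start_time and wf_increment are the standard TDMS waveform properties):

```python
from datetime import datetime
import numpy as np
from nptdms import TdmsWriter, ChannelObject

FS = 102400.0            # sample rate in S/s

def read_blocks(n_blocks=4, n_channels=24, n_samples=10240):
    """Placeholder for the Network Stream read: yields blocks of random samples."""
    for _ in range(n_blocks):
        yield np.random.rand(n_channels, n_samples)

# Timing is stored once as channel properties instead of travelling with every
# waveform, so only raw float64 samples have to cross the Network Stream.
timing = {"wf_start_time": np.datetime64(datetime.now(), "us"),
          "wf_increment": 1.0 / FS}

with TdmsWriter("acq_000.tdms") as writer:
    first = True
    for block in read_blocks():
        props = timing if first else {}
        channels = [ChannelObject("acq", f"ch{i}", row, properties=props)
                    for i, row in enumerate(block)]
        writer.write_segment(channels)
        first = False
```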

Message 3 of 3