
LabVIEW


Handling large data using TDMS data logging and analysing it in MATLAB

Hello,

I am using a USB-6356 data acquisition board to generate and acquire data at a sampling rate of 1 MS/s. I acquire the data for 1000 seconds in TDMS format, so 1M x 1000 = 10^9 samples in total, which creates a TDMS file of roughly 9 GB. I then use a MATLAB script to convert the TDMS data to .mat data. The script works fine for smaller TDMS files (< 2 GB), but it fails on the 9 GB file (probably because my PC has only 8 GB of RAM, which is not enough to hold the whole file in memory during the conversion).
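For reference, the constant-memory chunking idea that any splitting or conversion tool relies on can be sketched in Python (a sketch only; `split_file` is a hypothetical helper, not splitFiles.vi). Note that a raw byte split like this would corrupt an actual TDMS file, because TDMS interleaves metadata segments with the data; a TDMS-aware tool such as splitFiles.vi has to split at valid segment boundaries. The point of the sketch is only that memory use stays at roughly one chunk, regardless of total file size:

```python
import os

def split_file(src_path, out_dir, chunk_bytes=500_000_000):
    """Split a large binary file into numbered parts, reading one chunk
    at a time so memory use stays near chunk_bytes regardless of size."""
    part_paths = []
    with open(src_path, "rb") as src:
        index = 0
        while True:
            chunk = src.read(chunk_bytes)
            if not chunk:  # end of file
                break
            part = os.path.join(out_dir, f"part_{index:04d}.bin")
            with open(part, "wb") as dst:
                dst.write(chunk)
            part_paths.append(part)
            index += 1
    return part_paths
```

With `chunk_bytes=500_000_000` a 9 GB file would yield 18 full parts plus one remainder, matching the 18-19 files mentioned above.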

Therefore, I decided to split this large 9 GB file into smaller files using splitFiles.vi (see the attachment), as suggested by Andrew in this post. I have the following questions:

 

1) Can splitFiles.vi split my 9 GB file into around 18 (or 19) files of 500 MB each efficiently, without any data loss?

2) In my data acquisition VI, I use producer-consumer loops to acquire and log the data (see the attachment). I want to understand how my VI is able to store 9 GB of data in one TDMS file given that my RAM is only 8 GB. Does TDMS logging continuously release RAM so that there is always enough memory?

 

With smaller files, the MATLAB code converts the TDMS data to .mat data without trouble. But is there a better, more efficient way to do all of this?

 

Thanks in advance!

Message 1 of 4

@Amartansh13 wrote:

2) In my data acquisition VI, I use producer-consumer loops to acquire and log the data (see the attachment). I want to understand how my VI is able to store 9 GB of data in one TDMS file given that my RAM is only 8 GB. Does TDMS logging continuously release RAM so that there is always enough memory?


The data does not have to stay in memory once it has been written to disk.
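That behaviour can be sketched outside LabVIEW as a minimal producer-consumer loop in Python (a stand-in sketch, not the actual VI; `bytes(chunk_bytes)` stands in for a DAQmx read). The bounded queue means at most a few chunks ever live in RAM; once the consumer writes a chunk to disk, that memory is freed, so the file on disk can grow far beyond the available RAM:

```python
import queue
import threading

def acquire_and_log(out_path, n_chunks, chunk_bytes=1024, queue_depth=8):
    """Producer-consumer logging with a bounded queue: at most
    queue_depth chunks are held in RAM at any one time."""
    q = queue.Queue(maxsize=queue_depth)  # producer blocks when full

    def producer():
        for _ in range(n_chunks):
            q.put(bytes(chunk_bytes))  # stand-in for a DAQmx Read
        q.put(None)                    # sentinel: acquisition finished

    def consumer():
        with open(out_path, "wb") as f:
            while (chunk := q.get()) is not None:
                f.write(chunk)  # chunk becomes garbage-collectable here

    threads = [threading.Thread(target=producer),
               threading.Thread(target=consumer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```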

 

Something that will make your life A LOT easier is the DAQmx streaming capability.  Use the DAQmx Configure Logging VI before you start the task; this lets DAQmx stream the data straight to disk, which is even more efficient than the producer/consumer setup.  Now let's take it a step further: with a DAQmx Read property node you can set Logging.SamplesPerFile, which makes DAQmx create a new file once X samples have been streamed to the current one.  It even uses incremental file names for you.
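For readers working from Python instead of LabVIEW, the same configuration can be sketched with NI's nidaqmx Python package (a sketch only: it requires NI hardware, and the device name "Dev1/ai0", file name, and samples-per-file value are placeholder assumptions; check the driver's constraints on allowed values):

```python
import time

import nidaqmx
from nidaqmx.constants import AcquisitionType, LoggingMode, LoggingOperation

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(
        rate=1_000_000, sample_mode=AcquisitionType.CONTINUOUS)

    # Equivalent of the DAQmx Configure Logging VI: stream straight to TDMS.
    task.in_stream.configure_logging(
        "acq.tdms",
        LoggingMode.LOG,  # log-only: no copies read back into the program
        operation=LoggingOperation.CREATE_OR_REPLACE)

    # Equivalent of the Logging.SamplesPerFile property: roll over to a
    # new, incrementally named file after this many samples.
    task.in_stream.logging_samps_per_file = 62_500_000

    task.start()
    time.sleep(1000)  # acquire for 1000 s
    task.stop()
```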


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 2 of 4

Thanks a lot for your reply. 

Can you give an example VI showing the use of the DAQmx Read property node?

Also, is there any problem with using splitFiles.vi to split the large file before converting it in MATLAB?

In your opinion, which method is better: 1) saving the data into multiple small files, or 2) saving the data into one large file and then splitting it into multiple small files using splitFiles.vi?

Message 3 of 4

I'm on mobile, so I cannot make an example right now.  Just dig around in the DAQmx Advanced palette and you will find it.

 

I haven't looked at the split VI.  But in my opinion it is better to create multiple smaller files that you can handle than to make one large file and then split it up.


Message 4 of 4