TDMS writing 2 tasks to SD card on CompactDAQ: failures

I am trying to stream continuous data from 2 tasks to an SD card on a cDAQ-9137.  I thought this was easy, until it wasn't.  I'll list the hardware below, and then what I'm doing, and hopefully someone has insight.

 

Problem:

Using 2 TDMS streams for long-term continuous logging to an SD card fails over time on a cDAQ-9137 using the SD card bay.

 

Hardware:

cDAQ-9137 (running WES7, LabVIEW 2016, various 32 GB SanDisk SD cards, 2 GB RAM)

9223 (2 channels at 100 kHz, Task A)

9223 (4 channels at 10 kHz, Task B)

9223 (4 channels at 10 kHz, Task B)

9223 (4 channels at 10 kHz, Task B)

 

What I did:

All tasks were set up using the DAQ Assistant, and then I used the 'Generate DAQmx Code' option to get the tasks defined.  That said, I have the same problems with long-term writing even when just using the DAQ Assistant-generated helpers in a while loop.  I've attached the task setup for Task A (fastDAQ.vi) and Task B (slowDAQ.vi).

 

I define two tasks, one for 100 kHz x 2 channels (Task A) and one for 10 kHz x 12 channels (Task B).  I set both for continuous logging to a TDMS file, log-only mode, with a 100 MB buffer (I started smaller but ramped that up hoping it would help).  Initially everything worked fine (short, hour-long tests); then, when trying to do longer runs (10 hours), I went from completing an entire run to shorter and shorter acquisition times before getting the dreaded 200279 buffer error.
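For a rough idea of how much headroom that 100 MB buffer actually buys, here is a small back-of-the-envelope calculation (a Python sketch; it assumes 2-byte raw samples, since the NI-9223 is a 16-bit ADC and log-only TDMS stores raw samples):

```python
# Rough estimate of how long a 100 MB DAQmx buffer can absorb a disk stall,
# assuming 2-byte raw samples (16-bit ADC, log-only TDMS logging).
BYTES_PER_SAMPLE = 2
BUFFER_BYTES = 100 * 1024 * 1024  # the 100 MB buffer from the post

def buffer_seconds(channels, rate_hz, buffer_bytes=BUFFER_BYTES):
    """Seconds of data the buffer can hold before error 200279 fires."""
    bytes_per_second = channels * rate_hz * BYTES_PER_SAMPLE
    return buffer_bytes / bytes_per_second

print(f"Task A: {buffer_seconds(2, 100_000):.0f} s")   # ~262 s of headroom
print(f"Task B: {buffer_seconds(12, 10_000):.0f} s")   # ~437 s of headroom
```

If those numbers are right, a 200279 after hours of clean running means the card stopped keeping up for minutes at a stretch, which points at the flash rather than the buffer size.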

 

I then started breaking up the files using the 'samplesperfile' TDMS property to break files at roughly 1 hour.  That made no difference: random 200279 crashes still happened anywhere from 30 minutes to several hours into a run.  I was then hopeful that combining 'filepreallocationsize' with 'samplesperfile' would help.  Here I set filepreallocationsize = samplesperfile = 360009728 for Task A and 36003840 for Task B (these values equal about 1 hour).  I think this does help, but the next problem I run into is that the pre-allocation only happens for the first file (up to the samplesperfile value).  The next TDMS file created (in continuous acquisition) does not appear to pre-allocate.  It hasn't crashed yet, but I know it will.  I can see on the disk that the initial file was allocated at its full size when it was created, but when the TDMS logging incremented to the next file, its size started at zero and grew as data was written to the card.
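Those coerced values look like one hour of per-channel samples rounded up to a disk-friendly block boundary (DAQmx coerces samples-per-file upward so file boundaries land on whole write blocks).  A sketch of that rounding, with a hypothetical 4096-sample block size chosen purely for illustration — it happens to reproduce the Task B value exactly, while Task A's value is coerced slightly further:

```python
def round_up(samples, block):
    """Round a sample count up to the next multiple of `block` (ceiling division)."""
    return -(-samples // block) * block

one_hour_a = 100_000 * 3600  # 360,000,000 samples/channel for Task A
one_hour_b = 10_000 * 3600   # 36,000,000 samples/channel for Task B

# block=4096 is an assumed write-block size for illustration only;
# DAQmx reports the actual coerced value after you set the property.
print(round_up(one_hour_b, 4096))  # 36003840 -- matches Task B's coerced value
print(round_up(one_hour_a, 4096))
```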

 

To give an idea of the file sizes involved: for Task A, 1 hour = 1.4 GB; for Task B, 1 hour = 0.84 GB.
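Those sizes are consistent with raw 16-bit samples (2 bytes each), which is what log-only TDMS writes — a quick check:

```python
BYTES_PER_SAMPLE = 2  # NI-9223 is a 16-bit ADC; log-only TDMS stores raw samples

def gb_per_hour(channels, rate_hz):
    """Raw TDMS data volume per hour, in GB (1e9 bytes)."""
    return channels * rate_hz * BYTES_PER_SAMPLE * 3600 / 1e9

print(f"Task A: {gb_per_hour(2, 100_000):.2f} GB/h")   # ~1.44 GB/h
print(f"Task B: {gb_per_hour(12, 10_000):.2f} GB/h")   # ~0.86 GB/h
```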

 

I definitely have flash problems on the SD card and have tried several brands of brand-new cards.  Writing two streams on this system without pre-allocation fails over time, and I can see how writing two interleaved streams would be hard on flash.  I have also tried FAT32, exFAT, and NTFS formatting without any noticeable difference.  Using Performance Monitor, I can see where the problems come in: streaming works fine up to a point, then the drive queues get long, and then 200279 kicks in.  I chalk all this up to me not combining LabVIEW and flash the right way.

 

LabVIEW questions:

1) Is there any way to use both samplesperfile and filepreallocationsize such that every file in a continuous acquisition is pre-allocated (not just the first one)?

2) Is there a more optimal way to write to SD cards in this 2-task scenario?

 

If anyone has a recommendation for different storage, or knows of SD cards that hold up well, I would appreciate the advice.  I'm starting to wonder if I should plug a hard drive into the USB port and give that a try.  Thanks for any advice.

Message 1 of 3

Well, in general SD cards get slower as you fill them.  (I am guessing it's something in the way the internal addresses are handled?)

 

Have you thought about using a more robust program architecture, like a Producer/Consumer architecture?  A Producer/Consumer design uses a queue to hold the data between an acquisition loop and a write-to-disk loop.  That way, delays in writing to the SD card do not overflow your acquisition buffer.
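Since the pattern is the same in any language, here is a minimal Python stand-in for that LabVIEW Producer/Consumer structure (names and block sizes are illustrative only): the acquisition loop does nothing but read and enqueue, while a separate loop owns the slow disk writes, so a stalled SD card backs up the queue in RAM instead of the DAQ buffer.

```python
# Minimal producer/consumer sketch: acquisition and disk I/O in separate loops,
# decoupled by a bounded queue.
import os
import queue
import tempfile
import threading

data_q = queue.Queue(maxsize=256)  # bounded queue: explicit RAM headroom
SENTINEL = None                    # end-of-acquisition marker
BLOCK = bytes(1024)                # placeholder for one chunk of raw samples

def producer(n_blocks):
    """Stand-in for the DAQmx read loop: just read and enqueue."""
    for _ in range(n_blocks):
        data_q.put(BLOCK)          # blocks only if the writer falls far behind
    data_q.put(SENTINEL)

def consumer(path):
    """Stand-in for the TDMS/file write loop: all slow disk I/O lives here."""
    with open(path, "wb") as f:
        while (block := data_q.get()) is not SENTINEL:
            f.write(block)

path = os.path.join(tempfile.gettempdir(), "stream.bin")
writer = threading.Thread(target=consumer, args=(path,))
writer.start()
producer(100)
writer.join()
written = os.path.getsize(path)    # 100 blocks x 1024 bytes
```

The bounded `maxsize` is the design choice that matters: it makes the memory headroom explicit, and a full queue tells you the disk has fallen behind before any data is lost.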

-------------------------------------------------------------------
Unfortunately, most readers of this Forum, including some real Experts, have not mastered the skill of being able to read the code that Posters fail to post. If we cannot see "what you did wrong", we are unable to tell you how to fix it. (Bob Schor 28 August 2018)
Message 2 of 3

Thanks, I wasn't aware of the SD card slowdown issue.  Since it made it through a full 10 hr cycle once and then performance degraded severely over subsequent runs, I think that might be a secondary issue (the main issue being that I'm doing something bad to the SD card to begin with).  The data rate ends up being about 0.7 MB/s.
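For what it's worth, the per-hour file sizes quoted earlier in the thread imply a combined rate a bit under that figure:

```python
# Combined write rate implied by the file sizes quoted earlier (1.4 GB/h + 0.84 GB/h).
total_gb_per_hour = 1.4 + 0.84               # Task A + Task B, GB per hour
mb_per_s = total_gb_per_hour * 1e9 / 3600 / 1e6
print(f"{mb_per_s:.2f} MB/s")                # roughly 0.6 MB/s combined
```

Either way, that is a modest sustained rate for any decent card, which again suggests the failures come from write stalls, not raw throughput.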

 

I haven't looked into a different architecture, but I will if there isn't some fix to the TDMS streaming that I'm missing.

Message 3 of 3