Best Practices for Real-Time Data Logging in LabVIEW Using NI DAQ Devices

Hi everyone,

I'm currently working on a project that involves real-time data logging using LabVIEW and NI DAQ devices. While I've managed to set up basic data acquisition, I'm facing challenges with ensuring reliable and efficient data logging, especially when dealing with high-frequency signals. I'm particularly interested in:

- Optimal buffer sizes and memory management techniques
- Strategies to prevent data loss during high-speed acquisition
- Best practices for structuring the LabVIEW code for scalability and maintainability

I would greatly appreciate any insights, experiences, or additional resources you could share to help me enhance the reliability and efficiency of my data logging setup.

I've come across several resources that have been helpful:

https://www.youtube.com/watch?v=GBhJk5Tnshc 

https://www.theengineeringprojects.com/2023/08/introduction-to-labview.html 
https://www.viewpointusa.com/labview/labview-data-acquisition/ 

https://www.m4-engineering.com/multi-sensor-data-using-labview/ 

 

Message 1 of 6
(401 Views)

The simplest way is to just configure DAQmx to stream it straight to a TDMS file: https://www.ni.com/en/support/documentation/supplemental/21/ni-daqmx-high-speed-streaming-to-disk.ht...

 

That'll save all of the data you capture. If you're trying to save subsets of data (e.g., monitoring continuously at 100 kHz for an hour, and need to save 100 ms of interesting content) you'll have to do it manually.
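If you do need the manual route, the usual trick is a circular (ring) buffer: keep only the most recent window of samples in memory, and snapshot it when something interesting happens. Here's a minimal plain-Python sketch of the idea (the names, rates, and trigger are illustrative only, not DAQmx APIs):

```python
from collections import deque

# Rolling pre-trigger window: the deque silently discards the oldest
# samples once it is full, so memory use stays bounded.
SAMPLE_RATE = 100_000          # 100 kHz acquisition (illustrative)
WINDOW_S = 0.1                 # keep the last 100 ms around
ring = deque(maxlen=int(SAMPLE_RATE * WINDOW_S))

def on_samples(chunk, trigger):
    """Feed each DAQ read into the ring; snapshot it when the trigger fires."""
    ring.extend(chunk)
    if trigger(chunk):
        return list(ring)      # hand this snapshot to a file-writing loop
    return None
```

In LabVIEW terms this is a fixed-size array (or lossy queue) used as a circular buffer; on a trigger, you'd enqueue the snapshot to your file-writer loop.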

 

In general, though, you want a producer/consumer architecture. And you're allowed to have multiple of these in one program! For example, you can enqueue your data into multiple queues to be dequeued by multiple loops. You might have one loop for your UI, which takes your 100 kHz data and averages it down, updating your screen every 100 ms. Another loop might write each datapoint to disk. (I'd recommend TDMS files for storage, but you can use CSVs too; they're just a LOT bigger on disk.)
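Since LabVIEW is graphical, here's the same fan-out producer/consumer idea sketched in textual form with Python threads and queues (all names and numbers are illustrative, not DAQmx calls): one producer pushes each "read" into two queues, a UI loop decimates for display, and a logger loop keeps every sample.

```python
import queue
import threading

ui_q = queue.Queue()
log_q = queue.Queue()

def producer(n_chunks, chunk):
    """Stands in for the DAQmx read loop: fan each read out to every queue."""
    for _ in range(n_chunks):
        ui_q.put(chunk)
        log_q.put(chunk)
    ui_q.put(None)             # sentinels tell the consumers to stop
    log_q.put(None)

def ui_consumer(out):
    """Decimate for display: reduce each chunk to its mean."""
    while (data := ui_q.get()) is not None:
        out.append(sum(data) / len(data))

def log_consumer(out):
    """Stands in for the disk-writer loop: keep every sample."""
    while (data := log_q.get()) is not None:
        out.extend(data)

means, logged = [], []
threads = [threading.Thread(target=producer, args=(5, [1.0, 2.0, 3.0])),
           threading.Thread(target=ui_consumer, args=(means,)),
           threading.Thread(target=log_consumer, args=(logged,))]
for t in threads: t.start()
for t in threads: t.join()
```

Because each consumer has its own queue, a slow UI update can never stall the disk writer; that decoupling is the whole point of the pattern.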

 

Look up some of the project templates that ship with LabVIEW. IIRC one is called "Continuous Measurement and Logging"; it might point you in the right direction.

Message 2 of 6
(388 Views)

Mostly just reiterating what Bert said...

 

1. For logging data, the DAQmx Configure Logging is by far the most efficient route to take. It easily beats out a Producer/Consumer for logging data to disk.

 

2. It is best to have a loop that does nothing but read from the DAQ. Use a queue or notifier to pass the data to other loops that may need the data. There should be no explicit wait in this loop.

 

3. There is a rule-of-thumb to read 100ms of data each iteration of your loop. Obviously, the number of samples you read will depend on your sample rate.

 

4. For most setups, do not worry about setting the buffer size. The default is usually larger than you will need, especially if you are following point 3 above.


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 3 of 6
(374 Views)

@crossrulz wrote:

3. There is a rule-of-thumb to read 100ms of data each iteration of your loop. Obviously, the number of samples you read will depend on your sample rate.

 


Slightly off topic, but this is the first I've heard of this. Is there some history behind it? Granted, I don't save my data to disk (instead I process it in chunks and send it out via TCP), so is it HD write speed, or something along those lines, that would suggest writing more frequently instead of in larger chunks? From a network-performance standpoint, the handshaking alone adds enough delay to justify doing mine in 2-second chunks rather than more frequent, smaller transfers, but I realize that writing to disk is apples to oranges compared to sending network traffic.

 

Message 4 of 6
(276 Views)

@DHerron wrote:

@crossrulz wrote:

3. There is a rule-of-thumb to read 100ms of data each iteration of your loop. Obviously, the number of samples you read will depend on your sample rate.

 


Slightly off topic, but this is the first I've heard of this. Is there some history behind it? Granted, I don't save my data to disk (instead I process it in chunks and send it out via TCP), so is it HD write speed, or something along those lines, that would suggest writing more frequently instead of in larger chunks? From a network-performance standpoint, the handshaking alone adds enough delay to justify doing mine in 2-second chunks rather than more frequent, smaller transfers, but I realize that writing to disk is apples to oranges compared to sending network traffic.

 


I can say from experience that it is easy to go down the rabbit hole of buffer size, samples to download, etc. Here are some things that I have found; they are corner cases and really only apply to high stream rates.

  1. 100 ms of data works 99% of the time. (For low sample rates, a longer interval also works.) However, sometimes using an allowed File Write Size value is better. To find it, temporarily switch to Log Only mode and see which allowed value is closest to 100 ms.
  2. The number of samples to read, the buffer size, and the file size need to divide evenly into one another (e.g., the file size can be 1/2 or 1/4 of the buffer size), and each should be a multiple of the disk sector size.

Below are some screenshots of a simulated instrument running at a high rate using the above two points. It has been running for the last 20 minutes without issue in Log and Read mode (reading, but not saving, the data).

 

The values I had were:

- Sample rate: 10 MSa/s per channel, 8 channels
- N Samples to Read: 125,952 samples (~12 ms of data!)
- Buffer size: 80,609,280 samples (~8 s of data)
- File size: 20,152,320 samples (~2 s of data)

The file size is 1/4 of the buffer size, and the buffer size and file size are both multiples of N Samples to Read and of the disk sector size (512).
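As a sanity check, those divisibility relationships can be verified in a few lines of plain Python (the helper name is mine, not a DAQmx API):

```python
def check_alignment(n_read, buf_size, file_size, sector=512):
    """Verify the divisibility rules from point 2 above:
    buffer and file sizes are multiples of the read size and the
    sector size, and the file size divides the buffer size evenly."""
    return (buf_size % n_read == 0
            and file_size % n_read == 0
            and buf_size % sector == 0
            and file_size % sector == 0
            and buf_size % file_size == 0)

# The values quoted above (10 MSa/s/ch, 8 channels):
N_READ, BUF, FILE_SZ = 125_952, 80_609_280, 20_152_320
```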

 

You can see that DAQmx can be efficient, as the CPU load is low.

 

[Screenshots: DAQmx task running in Log and Read mode with low CPU load]

Message 5 of 6
(270 Views)

Have you evaluated FlexLogger? There's even a free version. That product has already figured out the DAQ and logging pieces of the puzzle, and an off-the-shelf product takes a lot of the maintainability burden off your shoulders.

Doug
Enthusiast for LabVIEW, DAQmx, and Sound and Vibration
Message 6 of 6
(184 Views)