accuracy of labview timing using DAQ Assistant

My question is about acquiring multiple channels of data (32) with the DAQ Assistant on an NI cDAQ-9178. I acquire the 32 channels using DAQ Assistant, split the dynamic data into separate channels, and save each channel to its own TDMS file. The VI samples continuously and appends data to each file as it is acquired. My question concerns the time column in the TDMS files. I can either (1) write a separate time column into each file (two columns total per file), or (2) rely on the initial timestamp in the file and the known sampling rate to reconstruct the time column in post-processing. Is reconstructing the time vector from the initial timestamp in the TDMS header and the sampling rate equivalent to recording the time vector directly as a second column? I am also splitting the files into 5-minute chunks to keep their sizes down. Will the hardware keep good enough time over those 5 minutes to rely only on the initial timestamp and sampling rate, or is it better to record the entire time vector and double the data on disk (since each channel goes into its own file)?
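To illustrate option 2, here is a minimal Python sketch of the reconstruction. For waveform data, TDMS channels carry the properties "wf_start_time" and "wf_increment" (readable with, e.g., the npTDMS package); the sampling rate, chunk size, and start time below are assumed values for illustration only.

```python
import numpy as np

# Option 2: rebuild the time vector from the initial timestamp and the
# known sampling rate, instead of storing a time column in each file.
fs = 25_000.0          # sampling rate in Hz (assumed)
n_samples = 25_000     # one chunk of acquired data (assumed)
t0 = 0.0               # initial timestamp in seconds (would be wf_start_time)

# Reconstructed time vector: t[n] = t0 + n / fs
t_reconstructed = t0 + np.arange(n_samples) / fs

# A time column written at acquisition time is generated the same way by
# the driver (t0 plus a fixed increment per sample), so for hardware-timed
# acquisition the two are numerically identical:
t_stored = t0 + np.arange(n_samples) * (1.0 / fs)

assert np.allclose(t_reconstructed, t_stored)
```

The point of the comparison: a stored per-sample time column contains no information beyond t0 and the increment, so reconstructing it in post-processing loses nothing for hardware-timed data.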

 

Thanks for the help,

Mike

Message 1 of 6

The accuracy of a timestamp is only as good as the Windows clock.  The accuracy of the differences between timestamps depends on how you are acquiring data.

 

If you are acquiring data one sample at a time, then each timestamp is again only as good as the Windows clock. If you are doing continuous acquisition of multiple samples, then the timing between samples is as good as the clock on the DAQ device you are using.

 

It doesn't matter whether your file has only one timestamp at the top with a series of data rows below it, or a separate timestamp for each data point. The timing between data points is as good as the DAQ device.
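What a full time column does cost is disk space. A rough back-of-the-envelope in Python; the original post doesn't state a sampling rate, so 25 kS/s is an assumed figure for illustration:

```python
# Cost of storing a per-sample time column in each file, versus relying
# on one header timestamp. All figures are hypothetical.
channels = 32
fs = 25_000            # samples per second per channel (assumed)
minutes = 5            # file chunk length from the original post
bytes_per_value = 8    # double-precision float

samples_per_file = fs * minutes * 60
data_bytes = samples_per_file * bytes_per_value
time_bytes = data_bytes          # a time column is the same size as the data

print(f"per 5-minute file: {data_bytes / 1e6:.0f} MB of data, "
      f"{(data_bytes + time_bytes) / 1e6:.0f} MB with a time column")
print(f"redundant time data across {channels} files: "
      f"{channels * time_bytes / 1e9:.2f} GB per 5 minutes")
```

At these assumed rates the time column exactly doubles each file, which is why reconstructing time from the header timestamp is the usual choice.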

Message 2 of 6

thanks for the reply,

I have a follow-up question. If I'm reading this properly, it sounds like a new LabVIEW timestamp is taken every time a new chunk of data arrives in LabVIEW. For instance, if I acquire continuously, say 25k samples at a time, is a new timestamp taken every 25k samples? Also, does NI hardware use the computer clock to time the sampling, or is it timed internally on the hardware? I would assume it is timed on the DAQ.

 

Thanks again,

Message 3 of 6

@mibrady2 wrote:

Is a new timestamp taken every 25k samples? Also, does NI hardware use the computer clock to sample, or is it timed internally on the hardware?

If sampling continuously, the timing is done on the DAQ board.
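Since the sample clock lives on the DAQ hardware, the only timing error within a file is the timebase's parts-per-million error. A quick worst-case check in Python; the 50 ppm figure is an assumption for illustration and should be verified against the cDAQ-9178 specifications:

```python
# Worst-case drift of a hardware-timed sample clock over one file chunk.
timebase_accuracy_ppm = 50   # assumed; check the cDAQ-9178 spec sheet
file_duration_s = 5 * 60     # 5-minute file chunks from the original post

worst_case_drift_s = file_duration_s * timebase_accuracy_ppm * 1e-6
print(f"worst-case drift over {file_duration_s} s: "
      f"{worst_case_drift_s * 1e3:.1f} ms")
```

Even at tens of ppm the drift over a 5-minute file is on the order of milliseconds, and it applies identically whether the time column is stored or reconstructed, so storing the full column doesn't buy any accuracy back.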

 

I'll have to find some time to experiment and see how the timestamp behaves. I haven't run into that situation yet.


Message 4 of 6

Hello mibrady2,

 

For more information on this topic, please take a look at this KnowledgeBase article.

 

Regards,

 

Izzy O.

Applications Engineer

National Instruments

  

Message 5 of 6

Thank you, that article was very helpful.

Message 6 of 6