Multifunction DAQ

DAQmx Read scalar data vs waveform data output

I'm doing multiple-channel DAQ using 'DAQmx Read' with sampling rates ranging from 20 Hz to 1,000,000 Hz. All the data must be timestamped with IRIG time. I have used 'niSync Get Time' alongside 'DAQmx Read', with the data output in both array and waveform formats. Is one more accurate than the other?

My users want a time channel when they examine the data in a TDMS file. They don't want to have to extract the time from a waveform channel in DIAdem. That seems to lend itself to array data output, but I'm concerned about accuracy, especially at the higher sampling rates.
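
To show what I mean, here's a rough sketch of the "array output plus explicit time channel" idea (in Python with the nidaqmx package rather than my actual LabVIEW code; the device name, rate, and t0 value are made up):

```python
# Sketch only: Python nidaqmx instead of LabVIEW; device, rate, and t0 are placeholders.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

FS = 1_000_000          # sample rate (Hz); in my setup this varies from 20 Hz to 1 MHz
N = 10_000              # samples per read

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")   # hypothetical channels
    task.timing.cfg_samp_clk_timing(FS, sample_mode=AcquisitionType.CONTINUOUS)

    t0 = 0.0            # placeholder start time; this would come from 'niSync Get Time' (IRIG)
    data = np.asarray(task.read(number_of_samples_per_channel=N))

# Explicit time channel to log alongside the data channels in the TDMS file.
# Per-sample time is just t0 + n/FS, which is the same information a waveform
# carries in its t0/dt fields.
time_channel = t0 + np.arange(N) / FS
```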

Are there any guidelines on which output format is best to use with 'DAQmx Read'?

Message 1 of 2

I have not used IRIG, but this sounds like a variation of a thread I was in a little while back.

 

The fundamental issue is that you've got 3 sources of timing information -- 1 from IRIG, 1 from your DAQ sample clock (establishing "dt" in your waveforms), and 1 from the PC clock (establishing "t0" in your waveforms).   Ultimately, that's 2 too many.  You'll need a scheme to establish a master time source and then either correct the others or live with the discrepancy they report.
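
To put rough numbers on that (made-up values, just to show the shape of the problem):

```python
# Made-up numbers -- just illustrating why three timing sources is two too many.
fs_nominal = 1_000_000                  # requested DAQ sample clock rate (Hz)
fs_actual  = fs_nominal * (1 + 50e-6)   # hypothetical 50 ppm timebase error

pc_t0   = 1_700_000_000.0000            # waveform t0 from the PC clock (epoch seconds)
irig_t0 = 1_700_000_000.0123            # same instant per IRIG: 12.3 ms apart

n = 60 * fs_nominal                     # one minute's worth of samples
elapsed_per_dt = n / fs_nominal         # 60.000 s according to the waveform's dt
elapsed_actual = n / fs_actual          # ~59.997 s of real elapsed time

# Three different answers for "when was sample n acquired":
print(pc_t0 + elapsed_per_dt)           # PC clock t0 + nominal dt
print(irig_t0 + elapsed_per_dt)         # IRIG t0 + nominal dt
print(irig_t0 + elapsed_actual)         # IRIG t0 + actual sample clock behavior
```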

 

Here's a thread I was in with a decent but imperfect scheme that uses the PC clock as master and corrects the waveform "t0" field.

 

Unfortunately, I'm not familiar with IRIG usage and can't give detailed suggestions on how to apply the thoughts from that thread to your case.  Ideally, the IRIG device will let you route hardware timing signals for the DAQmx task to use.  If it can export both a timebase and a trigger, you'll get hardware sync automatically, though you'll likely still need to manipulate the "t0" element of the waveform to reflect IRIG time rather than PC clock time.
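
If you can latch an IRIG time (from 'niSync Get Time') against a known sample count, the t0 fix-up is mostly arithmetic. A minimal sketch, assuming hypothetical values for the IRIG reading and the sample count (I don't know the niSync calls well enough to show them):

```python
import numpy as np

fs = 1_000_000                        # DAQ sample clock rate (Hz)

# Hypothetical inputs: the IRIG time would come from 'niSync Get Time', and the
# sample count from the DAQmx task's total-samples-acquired property.
irig_latched   = 1_700_000_000.0123   # IRIG time read back at some instant
samples_so_far = 2_500                # samples already acquired at that instant

# IRIG as master: back the latched time off by the samples already acquired
# to get a corrected t0 for the start of the acquisition.
t0_corrected = irig_latched - samples_so_far / fs

# Per-sample times for a block of n samples starting at sample index n_start,
# suitable for writing as a time channel (or stuffing back into a waveform t0).
n_start, n = 0, 10_000
time_channel = t0_corrected + (n_start + np.arange(n)) / fs
```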

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 2