Sample Clock Drift (How to manage it?)

Hey All,

 

I'd appreciate your help digging into this peculiarity working with LabVIEW and sample clocks. I'm simulating an NI PCI-6250 with an SCXI chassis, and haven't had any problems with that as far as the usual DAQmx task manipulation and execution goes.

 

When we are ready to deploy our DAQ software, we will be running it for hours at a time, collecting data that needs to be reasonably synchronized (within a few milliseconds would be acceptable) with other instruments (not NI hardware, but integrated with the LabVIEW DAQ application via serial, Ethernet, etc.). For now I'm testing the software on a simulated SCXI chassis, since we know the hardware works and use it often with other DAQ software.

 

Where I'm a little stumped is how to deal with sample clock drift. I have looked at the following knowledge base article on NI-DAQmx waveform timestamps, and understand that the sample clock is referenced to the system clock at the start of data acquisition; no further reference to the system clock is made afterwards, as the hardware relies on its own sample clock to acquire data.

 How Accurate is the Timestamp of the Waveform Returned by my NI-DAQmx Device?

 

As there could be significant clock drift over the span of several hours of data acquisition, I need to account for this drift. I'm trying out adjusting the t0's and dt's of the waveforms using the system clock. It works, but I hope I'm not just hiding an underlying problem that can be solved correctly! The following VI is called immediately upon reading the sample buffer. This is a simplified version of the VI; the full version will also track and save the total drift (fudge factor) over time so we can review the drift of the data we have already acquired.
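In text form, the adjustment amounts to roughly this (a Python-flavoured sketch of the idea only; the actual implementation is the LabVIEW subVI below, and the names here are just illustrative):

```python
import time

class DriftTracker:
    """Rough text-language sketch of the timestamp-adjusting subVI (illustrative names only)."""

    def __init__(self, sample_rate):
        self.dt_nominal = 1.0 / sample_rate
        self.t_start = None        # system-clock time when acquisition effectively started
        self.samples_read = 0      # running total of samples returned by DAQmx Read
        self.total_drift = 0.0     # cumulative "fudge factor", saved for later review

    def adjust(self, n_samples):
        """Call immediately after each buffer read; returns an adjusted (t0, dt) for that chunk."""
        now = time.time()
        if self.t_start is None:
            # Back-date the start so the first chunk ends "now" on the system clock.
            self.t_start = now - n_samples * self.dt_nominal

        self.samples_read += n_samples

        # Drift = where the system clock is now vs. where the nominal sample clock says we should be.
        nominal_end = self.t_start + self.samples_read * self.dt_nominal
        self.total_drift = now - nominal_end   # positive => hardware sample clock runs slow

        # Spread the observed drift over all samples so far, so the last sample of
        # this chunk lines up with the current system-clock reading.
        dt_adjusted = (now - self.t_start) / self.samples_read
        t0_adjusted = self.t_start + (self.samples_read - n_samples) * dt_adjusted
        return t0_adjusted, dt_adjusted
```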

(SubVI snippet attached.) It was pretty surprising that the simulated hardware shows about a second of drift for each minute of acquisition. I don't know how the simulated device times its samples, and I haven't tested this with our real hardware yet, but I would be disappointed if it were no better. I would imagine that NI uses fairly accurate sample clocks, much better than the roughly 1.7% error I am getting with the simulated sample clock. Of course the computer system clock can't be trusted to be very accurate either (Windows isn't a real-time operating system, and NTP on Windows isn't as good as on other systems), but it should be good enough for our purposes. We don't generally look at phase relationships between data from different sets of hardware, so if we are a few milliseconds off, it is acceptable. Seconds or minutes of drift at the end of a long session, however, are not.
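To put that drift rate in perspective (the four-hour session length here is just an example):

```python
drift_per_minute = 1.0                       # seconds of drift observed per minute (simulated device)
drift_fraction = drift_per_minute / 60.0     # ~0.017, i.e. about 1.7 %

session_hours = 4                            # an example multi-hour session
total_drift_s = drift_fraction * session_hours * 3600
print(total_drift_s)                         # ~240 s, i.e. roughly four minutes of drift
```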

 

In any case, the raw data looks good with my hacked timestamping. If anyone has any experience they can share with respect to tracking sample clock drift, or any ideas on how to make the strip chart display the data without the artifacts, I would appreciate it!

 

Thanks a bunch,

 

Nathan

Message 1 of 22

I have zero experience with NI SCXI, but might it be possible to get serial/Ethernet interface modules for this SCXI chassis? I imagine that if a single piece of hardware serves all your interfaces, the timing might be better. But all this is just a wild guess 🙂 Have you contacted NI for advice yet? Usually they are very helpful.

Message 2 of 22

Hi Nathan,

 

AFAIK SCXI chassis are just used for signal conditioning and multiplexing; in my experience they are always attached to a DAQ card like your PCI-6250.

 

The SCXI chassis does not provide a timestamp or shared variables (AFAIK). Why do you think you can read a time from your SCXI box?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 3 of 22

A solution could be to spend some more money and get an NI PXI system with a real-time OS. You can purchase modules for DAQ, plus interface modules for Ethernet, GPIB, serial, etc.

I think your existing PCI-6250 is even compatible with PXI, but I am not sure of this.

 

In this way you would have all the incoming data in a single real-time unit, with all the benefits that an RT OS can offer...

Message 4 of 22

Hi Blokk,

 

how do you connect PCI devices to a PXI connector?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 5 of 22

@GerdW wrote:

Hi Blokk,

 

how do you connect PCI devices to a PXI connector?


Hmm, yes, the form factor is different (plus, from what I have read, PCI does not offer connectors for common sync lines...) 🙂

Edit:

There might be some tricks, however: http://digital.ni.com/public.nsf/allkb/219E270A357C37AE86256BF4006C16CB

But this is not officially supported...

 

Anyway, the OP could just buy a PXI DAQ module too. I think that with PXI you have many more sync options between your modules compared to a PC-based solution: http://www.ni.com/white-paper/13345/en/

 

 

Message 6 of 22

@GerdW wrote:

Hi Nathan,

 

AFAIK SCXI chassis are just used for signal conditioning and multiplexing; in my experience they are always attached to a DAQ card like your PCI-6250.

 

The SCXI chassis does not provide a timestamp or shared variables (AFAIK). Why do you think you can read a time from your SCXI box?



As far as I know you are correct, and this is consistent with my original post: you can't read time from the SCXI chassis, or from the PCI DAQ card.

 

When the acquisition task is started, a system clock timestamp (Windows system clock) is grabbed and used for the t0 of the very first waveform. The t0's of all subsequent waveforms are computed from the sample rate and the number of samples acquired since that first system clock timestamp. So if the hardware sample clock (not a time-of-day clock, just a frequency that triggers sampling) is not accurate, or if the computer system clock is not accurate, there will be compounding drift between your samples and their timestamps.
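A quick numeric illustration of how the error compounds (using the ~1.7% figure from my simulated device purely as an example; real hardware should be far better):

```python
nominal_rate = 1000.0                        # Hz, the rate requested in the DAQmx task
actual_rate = nominal_rate * (1 - 0.017)     # a hardware clock running ~1.7 % slow (illustrative)

t0_first = 0.0                               # system-clock timestamp taken when the task starts
seconds = 3600                               # one hour of acquisition

samples_acquired = actual_rate * seconds                   # what the hardware really produced
stamped_end = t0_first + samples_acquired / nominal_rate   # end time the driver computes
true_end = t0_first + seconds                              # where the system clock actually is
print(true_end - stamped_end)                # ~61 s of accumulated timestamp error after one hour
```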

 

The knowledge base article linked in the original post describes how this can happen. What it doesn't explain is what to do about the drift.

Message 7 of 22

Thanks for the ideas everyone. I might want to look into real time OS hardware...

Message 8 of 22

I am a bit confused about what timing information you actually get in the acquired waveform. As another poster wrote above, SCXI does not provide timing info along with the data. OK, so as you wrote, the DAQmx driver "creates" the t0 and so on based on the PC clock:

 

"Where I'm a little stumped, is how to deal with sample clock drift. I have looked at the following knowledge base article on DAQmx device data, and understand that the sample clock is referenced to the system clock at the start of data acquisition, and no further reference to the system clock is made while the hardware relies on its own hardware sample clock to acquire data."

 

In such a case I would do the following (a rough text sketch of the structure is at the end of this post):

  • Acquire only arrays of doubles at the required sampling rate
  • At every iteration of the DAQmx loop, insert the data array into a queue together with a single, locally generated timestamp
  • In the data consumer loop, check the latest Ethernet/serial/etc. data values using Notifiers
  • Insert the DAQmx data into a channel of a TDMS file along with that single timestamp
  • Save the Ethernet/serial/etc. data into other channels of the TDMS file, stamped with the same timestamp used for the DAQmx data array

The reasoning behind this procedure: with external devices using serial/Ethernet/etc. protocols, you simply have no information about when exactly those data were generated! Do you see the problem? You would like to sync the SCXI DAQmx data to those other data sources (serial, Ethernet, etc.), but you have no guarantee that those devices deliver data to your PC at exact intervals (within a few msec); a typical serial unit, for example, will not provide data with msec-level timing.

Unless your serial/Ethernet external units send you msec-resolution timestamps too? Could you share what kind of other signals you need to sync with the DAQmx data? What are those external units with serial/Ethernet interfaces?
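Purely to show the structure, here is a very rough Python-style sketch of that producer/consumer pattern (the DAQ read and the external reading are simulated placeholders; in LabVIEW this would be DAQmx Read, a queue, Notifiers and the TDMS VIs):

```python
import queue
import threading
import time
import csv
import random

data_q = queue.Queue()          # producer -> consumer, like a LabVIEW queue
STOP = object()                 # sentinel to end the consumer loop

def read_daq_chunk(n=1000):
    """Placeholder for a DAQmx Read of n samples (simulated data here)."""
    time.sleep(1.0)                                   # pretend the read blocks for 1 s of data
    return [random.random() for _ in range(n)]

def latest_external_reading():
    """Placeholder for the most recent serial/Ethernet value (a Notifier check in LabVIEW)."""
    return random.random()

def producer(chunks=5):
    """DAQmx loop: read a chunk, stamp it once with the PC clock, enqueue it."""
    for _ in range(chunks):
        samples = read_daq_chunk()
        data_q.put((time.time(), samples))            # one locally generated timestamp per chunk
    data_q.put(STOP)

def consumer(path="acquired.csv"):
    """Consumer loop: pair each chunk with the latest external readings and log both together."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        while True:
            item = data_q.get()
            if item is STOP:
                break
            stamp, samples = item
            external = latest_external_reading()
            # In LabVIEW this would go into TDMS channels; both share the same timestamp.
            writer.writerow([stamp, external] + samples)

t = threading.Thread(target=producer)
t.start()
consumer()
t.join()
```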

 

Message 9 of 22

OK, it sounds like you're doing more or less the same thing I am, just applying your own timestamps instead of fudging the waveform timestamps. In the end we get the same result. I didn't know whether there were other, more accurate ways to interrogate the sample clock.

 

I understand that other devices will have their own timing problems as well. That isn't something we can get around.

Message 10 of 22