Sample Clock Drift (How to manage it?)

Your real data acq sample clock will not drift by nearly 1 second per minute of acquisition.  But it *will* drift relative to the real-time system clock you query from the OS.  You say you want to be within a few msec over the course of several hours.  To illustrate, let's say 5 hours.  That's 18e6 msec.  If we also allow 18 msec drift, we're talking about aiming for agreement to within 1 part per million.
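To make that arithmetic concrete, here is a minimal check of the 1-ppm figure (written in Python rather than LabVIEW, purely for illustration; the numbers are the ones from the post above):

```python
# Required relative clock agreement to stay within 18 ms over 5 hours
duration_s = 5 * 3600          # 5 hours of acquisition = 18e6 msec
budget_s = 0.018               # allowed disagreement: 18 msec

required_ppm = budget_s / duration_s * 1e6
print(f"required agreement: {required_ppm:.2f} ppm")   # -> 1.00 ppm
```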

 

Neither your data acq board nor your OS time of day clock will be nearly that accurate or stable.  You need to approach this differently.  You're going to need to choose one single master timing source that you query to correlate all the different I/O sources.

 

An approach I tend to favor is to let the data acq sample clock be the master.  I'll then run an edge counting task to count cycles and let my comm processes query that count as a way to "timestamp" their incoming data streams.  I'll usually mark the start of my data acq task with the regular system time function too, and then I can derive a pseudo-time-of-day from my start time and the combo of sample rate & # samples.
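The original discussion is about LabVIEW, but the "sample clock as master" idea is easy to sketch in text. Below is a minimal Python/nidaqmx sketch of it; the device and terminal names ("Dev1/ctr0", "/Dev1/ai/SampleClock"), the sample rate, and the helper function are assumptions for illustration, not anything from the post.

```python
# Hypothetical sketch: count data-acq sample-clock edges and use that count
# to timestamp data from other sources. Device/terminal names are assumed.
import datetime
import nidaqmx
from nidaqmx.constants import Edge

SAMPLE_RATE_HZ = 1000.0                        # assumed AI sample rate

# Edge-counting task driven by the AI sample clock (the "master" timebase).
# (Assumes an AI task generating /Dev1/ai/SampleClock is already running.)
count_task = nidaqmx.Task()
ch = count_task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0", edge=Edge.RISING)
ch.ci_count_edges_term = "/Dev1/ai/SampleClock"   # count AI sample-clock ticks
count_task.start()

t0 = datetime.datetime.now()                   # rough time-of-day at task start

def timestamp_incoming(payload):
    """Tag an incoming comm message with the master-clock sample count."""
    n_samples = count_task.read()              # current edge count
    pseudo_tod = t0 + datetime.timedelta(seconds=n_samples / SAMPLE_RATE_HZ)
    return {"data": payload, "sample": n_samples, "pseudo_time": pseudo_tod}
```

The comm processes query the count the moment a message arrives, so every stream is correlated to the same hardware timebase rather than to the OS clock; the pseudo-time-of-day is only as good as the system time read at start, but the *relative* timing between streams follows the sample clock.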

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 21 of 22


I agree, Kevin!

 

I have extended that idea to be able to do critical high-speed timing across multiple platforms.

 

I used one of those fancy high-speed master clock modules and then "T"-ed the clock signal, over same-length cables, to a pair of counter boards in two distinct PXI chassis.

 

Start the count task on both machines, then start the clock, and the counts on both systems should be identical.

 

I used that to measure the propagation time of various messaging schemes between machines. The precision is limited only by how fast you can run your master clock and how fast the counter boards can count.
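The arithmetic behind that measurement is simple; here is an illustrative sketch assuming a (hypothetical) 10 MHz shared master clock, with made-up counts. The difference between the counts latched on the two chassis converts directly to propagation time, and the resolution is one clock period.

```python
# Hypothetical numbers: both counters run from the same shared master clock.
MASTER_CLOCK_HZ = 10e6          # assumed master clock rate -> 100 ns resolution

count_at_send = 1_234_567       # count latched on the sending chassis
count_at_receive = 1_234_982    # count latched on the receiving chassis

propagation_s = (count_at_receive - count_at_send) / MASTER_CLOCK_HZ
print(f"propagation time: {propagation_s * 1e6:.1f} us")   # -> 41.5 us
```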

 

Edit...

 

Since we are on the topic, I should also mention:

 

IEEE-1588 (Precision Time Protocol), which synchronizes clocks across a network.

 

AND

 

IRIG-B, which uses a PPS signal good to about 1 microsecond, derived from a GPS signal.

 

 

Spoiler

I believe the military can get better resolution! I believe that played a part in getting cruise missiles launched from subs and ships in different oceans/seas, and a variety of aircraft, to all show up in Baghdad at the same time.

 

 

 

 

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper | LinkedIn Profile | YouTube Channel
Message 22 of 22