
LabVIEW


Sample Clock Drift (How to manage it?)

To sync an external unit to an NI-DAQmx device/module, a physical trigger signal is often used, but this depends on the options available on both the external device and the NI one...

 

In the end we get the same thing.

Hmm, you might be right. Of course, in my measurements I only see that there are a few msec of fluctuation between the time stamps, since I use simple software timing, and I do not compare the PC time to an absolute reference "atomic clock" 🙂 You are right that the PC clock might drift a bit more from the "real absolute" time. But is a few tens or hundreds of milliseconds of drift after a one-day measurement really a problem? What kind of physical quantity are you measuring that would require such high timing accuracy?

Besides, keep in mind that Windows does regular time syncs via the internet (if it is connected). However, I have no idea about the details, nor the effect in your case (I guess if you use only the waveform, and not a time stamp, only the CPU tick count matters; if a Time Stamp is used, that can be affected by internet time syncing, I guess)...

 

Anyway, NI offers DAQ cards with internal clocks (for both RT and non-RT platforms), where you can get the timing accuracy you want. But still, the other external data sources are not synced to your DAQ data...

Message 11 of 22

Thanks for your input. I don't imagine the system clock would drift much; I think you are right, and I shouldn't worry so much about it. I was confused by the significant drift of the simulated device (I don't know how a simulated device works, but the real hardware has its own hardware clock signal that I should hope is pretty accurate). Hardware tests will soon follow!

Message 12 of 22

There are forum users here with several orders of magnitude more experience than I have; let's see if someone jumps in later to clarify these very interesting questions 🙂

Message 13 of 22

OK, so you asked "...is there a correct way?" and in fact there is. Minimize clock drift by following the correct environmental installation specs.

  • Clean the fan filters regularly.
  • Install slot blockers in unused slots.
  • Ensure adequate clearance for airflow.

This keeps your clocks cool. Often forgotten, but oh so important! Dusty filters restrict airflow, heating up your clocks and increasing drift. Missing slot blockers force the air the fan moves to go through the empty slots rather than past the components that are heating up. I even once saw a developer complaining about his "crappy chassis" when it was on its side with the top and rear backed into his cube walls AND the bottom used as one bookend for folders, manuals, and magazines (all intake vent slots 100% covered, fan exhaust muffled 100%), with no slot blockers and all empty slots away from the controller (slots 1–5 filled, slots 6–8 empty). No surprise: fixing that removed all noticeable drift.

 

Don't forget it ever again! 😉


"Should be" isn't "Is" -Jay
Message 14 of 22

1)

If you are doing a hardware-timed acquisition, start it once, and let it run, the samples will only drift as much as the on-board clock of your DAQ device. Check the spec of the card you are using.
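The back-of-envelope math behind "check the spec" is simple. Here is a small Python sketch (illustrative only, not NI code; the 50 ppm figure is an assumed example, your card's actual timebase accuracy spec may differ):

```python
# Sketch: translate a timebase accuracy spec (in ppm) into worst-case
# sample-clock drift over the length of a measurement.
def worst_case_drift_s(accuracy_ppm: float, elapsed_s: float) -> float:
    """Worst-case drift in seconds for a clock with the given ppm spec."""
    return accuracy_ppm * 1e-6 * elapsed_s

# Example: an assumed 50 ppm onboard oscillator over a 24-hour run.
drift = worst_case_drift_s(50, 24 * 3600)
print(f"{drift:.2f} s")  # worst case over one day: a few seconds
```

So even a modest ppm spec adds up to seconds per day, which is why a long free-running acquisition should be judged against the card's datasheet rather than against the PC clock.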

 

2)

If your DAQ device has a drift spec that is too loose, then you can go with timing modules with crystal-controlled blah blah clocks that are rock solid and can be used to control the acquisition.

 

3)

Your DAQ device cannot always run at the frequency you want. There is a fixed set of sample rates determined by the module itself. When you request a sample rate, DAQmx will find the closest achievable frequency and use that. There is a property node for an input task that will tell you what the actual sample rate is. That dt is the number used to create the time stamps in the Waveform data.

 

Time Stamp = (Start Time) + (Number of Samples Acquired × dt)
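The same bookkeeping can be sketched in a few lines of Python (illustrative only; in LabVIEW the waveform datatype does this for you, and the start time and dt here are made-up values):

```python
from datetime import datetime, timedelta

def waveform_timestamp(start_time: datetime,
                       n_samples_acquired: int,
                       dt: float) -> datetime:
    """Time stamp of a sample: start time plus samples-so-far times dt."""
    return start_time + timedelta(seconds=n_samples_acquired * dt)

# Example: 1000 samples into an acquisition with an actual dt of 1 ms.
t0 = datetime(2019, 4, 16, 12, 0, 0)
print(waveform_timestamp(t0, 1000, 0.001))  # 2019-04-16 12:00:01
```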

 

The PC clock will wander, and using a time server will keep it close, but be aware that the time server may set the system clock back! Don't be surprised if you note a new system time that is actually older than a previously noted time.
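This is why, outside LabVIEW too, elapsed-time measurements should come from a monotonic clock rather than the system clock. A minimal Python illustration (standard library only):

```python
import time

# time.time() follows the system clock and can be stepped backwards by a
# time server; time.monotonic() is guaranteed never to go backwards, so
# use it for measuring intervals.
t_start = time.monotonic()
time.sleep(0.01)  # ... do some work ...
elapsed = time.monotonic() - t_start

assert elapsed >= 0  # always holds for a monotonic clock
print(f"elapsed: {elapsed:.3f} s")
```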

 

4)

If you need to develop a pseudo-time stamp for a serial widget, get the system time when the acquisition starts along with the ms tick count. Get the tick count every time you get something from the serial widget and add the delta tick count to your start time. Be aware the tick count rolls over every 49 days or so. As long as you are subtracting U32s, the roll-over math should cover you.
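The roll-over-safe subtraction works because unsigned 32-bit arithmetic wraps around. A Python sketch (Python integers don't wrap, so the mask emulates U32 behaviour):

```python
U32_MASK = 0xFFFFFFFF  # 2**32 - 1; a ms tick count wraps every 2**32 ms (~49.7 days)

def tick_delta_ms(start_tick: int, now_tick: int) -> int:
    """Elapsed ms between two U32 tick counts, correct across one rollover."""
    return (now_tick - start_tick) & U32_MASK

# Normal case: no rollover between the two reads.
print(tick_delta_ms(1_000, 5_000))            # 4000
# Rollover case: the counter wrapped from near 2**32 back past zero.
print(tick_delta_ms(0xFFFFFF00, 0x00000100))  # 512
```

The second call shows the point: a naive signed subtraction would give a huge negative number, but the wrapped U32 difference is still the true elapsed time, as long as no more than one rollover occurred between reads.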

 

Why is the simulated device drifting?

 

I cannot speak from experience, but... are you by any chance using a sample rate that is a multiple of a tenth of a second, or another value that cannot be represented exactly in binary?
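Ben's hunch is easy to demonstrate: 0.1 s has no exact binary representation, so repeatedly adding such a dt accumulates error (plain Python here, but the same IEEE-754 double behaviour applies in any language):

```python
# 0.1 cannot be represented exactly as a binary double...
print(f"{0.1:.20f}")  # prints 0.10000000000000000555...

# ...so summing it repeatedly drifts away from the true value:
t = 0.0
for _ in range(10):
    t += 0.1
print(t == 1.0)  # False
print(t)         # 0.9999999999999999
```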

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation, LabVIEW Champion, Knight of NI, and Prepper
Message 15 of 22

Well, nothing beats supercritical helium cooling! 😄

Message 16 of 22

@Blokk wrote:

Well, nothing beats supercritical helium cooling! 😄


Unless you chill it down to a near vacuum and start picking off the remaining atoms with a laser


Message 17 of 22

@JÞB wrote:

@Blokk wrote:

Well, nothing beats supercritical helium cooling! 😄


Unless you chill it down to a near vacuum and start picking off the remaining atoms with a laser


Yes, but then you do not have a cooling medium 😉 Helium is also not a good cooling medium: its heat capacity is too low. But you need that low end temperature for superconductors anyway...

 

There are nice systems with water cooling, or multi-stage Peltier coolers. OK, below the dew point you also have to worry about condensation...

Hmm, but a water-cooling system would be super geeky, with a built-in tropical aquarium 😄

OP, sorry for going a bit off topic 😄

Message 18 of 22

Ben wrote:

4)

If you need to develop a pseudo-time stamp for a serial widget, get the system time when the acquisition starts along with the ms tick count. Get the tick count every time you get something from the serial widget and add the delta tick count to your start time. Be aware the tick count rolls over every 49 days or so. As long as you are subtracting U32s, the roll-over math should cover you.

 

Ben


Ben, regarding your point 4: if we use "High Resolution Relative Seconds.vi" instead of "Tick Count (ms)", we avoid the roll-over problem, yes?

Message 19 of 22

@Blokk wrote:

Ben wrote:

4)

If you need to develop a pseudo-time stamp for a serial widget, get the system time when the acquisition starts along with the ms tick count. Get the tick count every time you get something from the serial widget and add the delta tick count to your start time. Be aware the tick count rolls over every 49 days or so. As long as you are subtracting U32s, the roll-over math should cover you.

 

Ben


Ben, regarding your point 4: if we use "High Resolution Relative Seconds.vi" instead of "Tick Count (ms)", we avoid the roll-over problem, yes?


I would think you are correct... within normal conditions. Somewhere along the line (maybe around LV 7) NI changed the time stamp from a double, because the approaching date values needed more bits than a double could provide. So as long as your "dt" is less than 100 years or so...
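A quick sanity check of how much resolution a double-precision relative-seconds value actually keeps over such spans (plain Python; `math.ulp` gives the spacing between adjacent doubles and needs Python 3.9+):

```python
import math

# Roughly one century, expressed in seconds.
century_s = 100 * 365.25 * 24 * 3600  # about 3.16e9 s

# The gap between adjacent doubles near that value:
print(math.ulp(century_s))  # ~4.8e-07 s, i.e. still sub-microsecond
```

So a double that holds seconds relative to a recent start point stays well below millisecond resolution even after a hundred years of elapsed time, which is why the roll-over-free relative-seconds approach holds up under normal conditions.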

 

I think you are correct!

 

Ben 

Message 20 of 22