
Why is the resolution of a time stamp and the millisecond timer different?

I'm repeatedly collecting data from a device via RS-232. I'd like to record the time along with the measurement itself. When I use Get Date/Time in Seconds, it seems to only have a 16 ms level of precision. By that, I mean two consecutive time stamps will never differ by less than 0.016 s. But the Tick Count VI seems to have actual millisecond resolution.

I understand that timing is system-dependent, especially at the millisecond level. But why do the two timers give me different values?

The time stamp data type is supposed to have 15 digits of precision in the fractions of a second, so that can't be it. Could it have something to do with how I'm taking the difference between the two time stamps?

Message 1 of 5
Get Date/Time in Seconds reads the computer's time-of-day clock/calendar. Even though the data format may have sufficient resolution, the data coming from the clock does not. Tick Count counts elapsed milliseconds, but it restarts when the computer is powered up. It is formatted as a U32, so it will roll over after some days (49, I think, without checking the calculation).
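
A quick back-of-the-envelope check of that rollover figure (sketched in Python here rather than LabVIEW, since the arithmetic is the same either way):

    # How long a 32-bit millisecond counter runs before wrapping around.
    MS_PER_DAY = 1000 * 60 * 60 * 24           # milliseconds in one day
    rollover_days = 2**32 / MS_PER_DAY         # full U32 range, expressed in days
    print(f"U32 millisecond counter rolls over after ~{rollover_days:.1f} days")
    # prints: U32 millisecond counter rolls over after ~49.7 days

So the 49-day estimate holds (about 49.7 days).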

If you need timing resolution better than 16 ms, you can mix the two timers. Take a Date/Time reading and a Tick Count reading at the beginning. Then add the difference between the first Tick Count and the current Tick Count to that initial Date/Time reading to get millisecond resolution.
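
A LabVIEW block diagram can't be pasted as text, but a rough Python sketch of the same mixing scheme may make the arithmetic concrete; time.time() and a monotonic millisecond counter are only stand-ins for Get Date/Time in Seconds and Tick Count, and all names here are illustrative:

    import time

    def ms_ticks():
        # Millisecond tick count, playing the role of LabVIEW's Tick Count (ms).
        return time.monotonic_ns() // 1_000_000

    # Reference pair taken once, at the start of acquisition.
    t0_wall = time.time()   # coarse absolute time (steps of roughly 16 ms)
    t0_tick = ms_ticks()    # fine relative time

    def timestamp():
        # Absolute time with ~1 ms resolution: start time plus elapsed ticks.
        return t0_wall + (ms_ticks() - t0_tick) / 1000.0

    for _ in range(3):
        print(f"{timestamp():.3f}")
        time.sleep(0.005)

Each timestamp is the single coarse reading taken at the start plus the milliseconds elapsed since then, which is exactly the mix described above.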

Lynn
Message 2 of 5
Hi,

I encountered problems mixing both clocks (Get Date/Time and Tick Count), because Get Date/Time may go wrong after a while, especially on machines with high CPU load and on RT targets (on cFP-RT the "speed" of Get Date/Time is reduced by however much CPU a time-critical loop occupies!). In my case I used Get Date/Time once at the start of the DAQ and exclusively Tick Count during the DAQ.

Regards,
Chris

CL(A)Dly bending G-Force with LabVIEW

famous last words: "oh my god, it is full of stars!"
Message 3 of 5
As with so many others, I'm fighting a losing battle with timer issues. I created a small test VI to see if I could use the method Lynn described; a GIF of the block diagram is included as attachment labview3.gif.

After running this for about 16 minutes, I find that the timestamps given by "Get Date/Time in Seconds" are about 6 ms behind the method of getting one Date/Time stamp before the loop starts and using the millisecond "Tick Count" to measure the time. This is using LabVIEW 5.0 on a Windows 98 machine with several processes running. A copy of the front panel is included as labview2.gif.

I created a similar VI using LV 7.1 on an XP machine with few processes running and discovered a discrepancy of about 7 milliseconds after a 3-hour run time. My question is, which clock is correct? Is the system clock losing time, is the millisecond timer gaining time, or is it a little bit of both?
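
For what it's worth, the comparison being described can also be sketched in Python (time.time() for the wall clock and a monotonic millisecond counter for the tick source, purely as stand-ins for the LabVIEW primitives):

    import time

    def ms_ticks():
        return time.monotonic_ns() // 1_000_000

    # Reference pair taken before the loop, as in the test VI.
    t0_wall = time.time()
    t0_tick = ms_ticks()

    # Periodically compare the live wall clock against the synthesized
    # (start time + elapsed ticks) timestamp and report the difference.
    for _ in range(5):
        time.sleep(1.0)
        synthesized = t0_wall + (ms_ticks() - t0_tick) / 1000.0
        drift_ms = (time.time() - synthesized) * 1000.0
        print(f"wall clock minus tick-based clock: {drift_ms:+.1f} ms")

Over a run of minutes to hours, the printed difference shows how far the two clock sources have drifted apart.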

(I'm still working on a way to use this combined clock for time stamping even after the millisecond clock rollover.)
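
On the rollover point: as long as the interval between the reference reading and the current reading is shorter than one rollover period (~49.7 days), subtracting the two tick counts as unsigned 32-bit values gives the right elapsed time even across a wrap. A small Python sketch of that wraparound subtraction (the mask stands in for plain U32 arithmetic; names are illustrative):

    U32_MASK = 0xFFFFFFFF   # Tick Count is a 32-bit unsigned value

    def elapsed_ms(start_tick, current_tick):
        # Elapsed milliseconds between two U32 tick counts, rollover-safe:
        # masking to 32 bits makes the subtraction wrap exactly the way the
        # counter itself wraps, so one rollover in between is harmless.
        return (current_tick - start_tick) & U32_MASK

    # Example: the counter wraps from near the top of the U32 range back to 0.
    start = 0xFFFFFF00                # shortly before the rollover
    later = 0x00000100                # shortly after the rollover
    print(elapsed_ms(start, later))   # prints 512, not a huge negative number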
Message 4 of 5
The two clocks are derived from different sources (oscillators) which are not synchronized. Accumulated counts will drift apart over time, as you have documented.

Walk around your house or office and see how many different times are shown on the clocks (including the ones in computers, microwave ovens, VCRs, etc.) and you can understand why philosophers have been asking "What is time? and What time is it?" for so long.

You must decide which time is important to your data. If Time of Day is key, then use the tick counter only to interpolate between readings of the Time of Day clock. For example, if you save to disk once per minute, take a Time of Day reading and a reference Tick Count reading after the disk write. On each successive timed operation, add the difference between the current Tick Count and the reference Tick Count to that Time of Day reading. At the next save, generate a new reference Tick Count and continue. This method adds a bit of housekeeping, but it ensures that your timing is never off by more than the error between the clocks over the one-minute interval, while preserving the ability to get 1 ms resolution.
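
A rough Python sketch of that bookkeeping (time.time() and a monotonic millisecond counter standing in for the Time of Day clock and Tick Count; the class and names are just for illustration):

    import time

    def ms_ticks():
        return time.monotonic_ns() // 1_000_000

    class InterpolatedClock:
        # Tick-counter interpolation between periodic Time of Day re-syncs.

        def __init__(self):
            self.resync()

        def resync(self):
            # Take a fresh Time of Day + Tick Count reference pair,
            # e.g. right after each once-per-minute disk write.
            self.ref_wall = time.time()
            self.ref_tick = ms_ticks()

        def now(self):
            # Reference wall time plus milliseconds elapsed since the reference.
            return self.ref_wall + (ms_ticks() - self.ref_tick) / 1000.0

    clock = InterpolatedClock()
    for i in range(120):
        sample_time = clock.now()   # ~1 ms resolution stamp for this reading
        # ... acquire and buffer a measurement here ...
        if i % 60 == 59:            # every 60th sample stands in for the save
            # ... write the buffered data to disk here ...
            clock.resync()          # new reference pair right after the write
        time.sleep(0.01)

The timestamps can never drift further from the Time of Day clock than whatever error accumulates between two consecutive re-syncs.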

Remember that any desktop OS may preempt your processes for times much longer than one millisecond.

Lynn
Message 5 of 5