# LabVIEW

## tick count vs get date/time in seconds

Hi

I have a question about the different dt values I get when using "Tick Count" and "Get Date/Time in Seconds" in a loop.

I have attached the picture that should describe the problem.

Why is there a difference, and what would be the best way to do this?

Yves

Message 1 of 5

## Re: tick count vs get date/time in seconds

"Tick Count" and "Get Date/Time in Seconds" do pretty much the same thing, but if you use them to calculate a time difference, you must be aware that Tick Count will eventually roll over.

Regarding your block diagram: the array that calculates the elapsed time for each loop iteration will always show 150 ms, since it is the timed loop's job to make sure each iteration completes in exactly the amount of time specified, which is 150 ms in your case. However, the code inside the loop can execute at any point during the 150 ms allocated to that iteration; i.e., you have 150 ms to run whatever is in the loop, but exactly when within that window the code runs is undetermined.

However, it seems like Tick Count gives you a pretty consistent result, 150 ms. It is possible that the execution time of Tick Count is more consistent than that of the other function.
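The rollover caveat above is worth spelling out. Tick Count (ms) is an unsigned 32-bit counter, so it wraps roughly every 49.7 days; a subtraction done modulo 2^32 still gives the correct elapsed time across the wrap. A minimal Python sketch of that arithmetic (the constant and helper name are illustrative, not a LabVIEW API):

```python
MS_WRAP = 2**32  # Tick Count (ms) is an unsigned 32-bit counter

def tick_diff_ms(later, earlier):
    """Rollover-safe elapsed time (ms) between two tick-count reads.

    Modular subtraction is correct as long as fewer than ~49.7 days
    (2**32 ms) elapsed between the two reads.
    """
    return (later - earlier) % MS_WRAP

before = MS_WRAP - 50   # read taken 50 ms before the counter wraps
after = 100             # read taken 100 ms after the wrap
assert tick_diff_ms(after, before) == 150  # still correct across rollover
```

In LabVIEW the same effect falls out automatically if you subtract two U32 tick counts with U32 arithmetic.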

------------------------------------------------------------------

Kudos and Accepted as Solution are welcome!
Message 2 of 5

## Re: tick count vs get date/time in seconds

The timestamp (which is what the "Get Date/Time..." primitive returns) has, at least on Windows, a resolution of about 16 ms (which is pretty clear in this example). If you want millisecond accuracy, you should use the Tick Count primitive.
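This quantization is exactly why the measured dt jitters around 150 ms instead of equalling it. A small Python sketch (the ~16 ms figure is the approximate Windows system-clock tick; the helper is illustrative) shows what a coarse clock reports for perfectly regular 150 ms events:

```python
RESOLUTION_MS = 16  # approximate Windows system-clock tick (~15.6 ms)

def quantize(t_ms):
    """What a coarse OS clock reports for a true time of t_ms."""
    return (t_ms // RESOLUTION_MS) * RESOLUTION_MS

# The loop period is exactly 150 ms, but the measured dt jitters:
true_times = [i * 150 for i in range(6)]
measured = [quantize(t) for t in true_times]
dts = [b - a for a, b in zip(measured, measured[1:])]
print(dts)  # [144, 144, 160, 144, 144] -- never exactly 150
```

The reported intervals cluster on multiples of the clock tick, which matches the scatter seen with "Get Date/Time in Seconds".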

___________________
Try to take over the world!
Message 3 of 5
Solution
Accepted by topic author Yves

## Re: tick count vs get date/time in seconds

jyang72211 wrote:

> "Tick Count" and "Get Date/Time in Seconds" do pretty much the same thing, but if you use them to calculate a time difference, you must be aware that Tick Count will eventually roll over.
>
> Regarding your block diagram: the array that calculates the elapsed time for each loop iteration will always show 150 ms, since it is the timed loop's job to make sure it ~~completes~~ starts each iteration at exactly the amount of time specified, which is 150 ms in your case. However, the code inside the loop can execute at any point during the 150 ms allocated to that iteration; i.e., you have 150 ms to run whatever is in the loop, but exactly when within that window the code runs is undetermined.
>
> However, it seems like Tick Count gives you a pretty consistent result, 150 ms. It is possible that the execution time of Tick Count is more consistent than that of the other function.

Trivia:

The difference between tick counts will always be correct, even when the counter rolls over.

The Timed Loop determines when each iteration starts; the code executing inside it determines when it completes. If it were the other way around, I would be slamming code that takes 5 minutes to run into a Timed Loop set to iterate once a second.

I agree with Yair that you are seeing the resolution in the OS time stamps.

Ben

Retired Senior Automation Systems Architect with Data Science Automation · LabVIEW Champion · Knight of NI and Prepper
Message 4 of 5

## Re: tick count vs get date/time in seconds

Although it is not too likely to show up in a short run such as in your images, it is important to remember that the tick count and the time-of-day clock run off two different, unsynchronized oscillators. This means that their timing accuracies will differ at the parts-per-million level and that they probably have different drift characteristics. If you wait long enough, you will always see differences between the two timing systems. If the time-of-day clock is periodically reset by a network time server, you may also see jumps when the resets occur.
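The same two-clock situation exists in most environments. In Python, for example, `time.time()` follows the system (time-of-day) clock while `time.perf_counter()` follows an independent monotonic counter; a rough sketch of comparing them over an interval (the exact offset observed will vary from run to run):

```python
import time

# time.time() tracks the wall clock (may be stepped by NTP);
# perf_counter() tracks a monotonic high-resolution counter.
# They are independent sources, so their measurements can disagree.
t0_wall, t0_perf = time.time(), time.perf_counter()
time.sleep(0.5)
wall_elapsed = time.time() - t0_wall
perf_elapsed = time.perf_counter() - t0_perf
print(f"wall: {wall_elapsed:.6f} s  perf: {perf_elapsed:.6f} s  "
      f"offset: {(wall_elapsed - perf_elapsed) * 1e3:+.3f} ms")
```

Over half a second the two stay close, but over days the parts-per-million differences Lynn describes accumulate, and an NTP step shows up as a sudden jump in the wall-clock measurement only.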

Lynn

Message 5 of 5