LabWindows/CVI


GetTickCount, CVI

How long is one tick in the CVI GetTickCount function?

Message 1 of 6

It’s a Windows API function, not a LabWindows/CVI one: https://learn.microsoft.com/en-us/windows/win32/api/sysinfoapi/nf-sysinfoapi-gettickcount

 

The value is the number of milliseconds since an arbitrary time (usually the system startup time), but the resolution can be 10 or even 16 ms, depending on the underlying OS.

Rolf Kalbermatter
My Blog
Message 2 of 6

I figured as much but wasn't sure. Thank you for the confirmation.

Message 3 of 6

After digging a little deeper, I'm starting to doubt that the tick count resolution is between 10 ms and 15 ms. Here is my debug code in CVI:

 

DWORD timeStart = GetTickCount();         /* ms since system start, before the delay */
Delay (startDelaySecond);                 /* CVI Delay(), argument in seconds */
DWORD timeStop = GetTickCount();          /* ms since system start, after the delay */
DWORD timeDiff = timeStop - timeStart;    /* elapsed time in milliseconds */
printf ("tick counts : %u\n", timeDiff);  /* DWORD is unsigned, so print with %u */

 

I varied "startDelaySecond" between 0.5 s and 0.8 s, and the printed tick count is very close to the number of milliseconds that I set.  For example, when I set Delay(0.8) the tick count is 800 +/- 2 counts or so.  We know that the NI Delay() function is very accurate, with 1 ms resolution.  So what does that mean? Is each tick count here 1 ms?

Message 4 of 6

First, there are two different issues here. The value itself is guaranteed to be in milliseconds since an arbitrary moment. But if you read the GetTickCount() value in quick succession, you do not necessarily get a value that increments by one. Instead you could read 10024 ten times and then suddenly 10034 or even 10040, since the value is only updated every so many milliseconds.
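If you want to see that stepping behavior on your own machine, a minimal sketch is to poll GetTickCount() in a tight loop and print only when the value changes (the step sizes you get depend entirely on the system):

DWORD start = GetTickCount();
DWORD last = start;
while (GetTickCount() - start < 100)       /* watch the counter for roughly 100 ms */
{
    DWORD now = GetTickCount();
    if (now != last)                       /* the value jumps in steps, not by 1 */
    {
        printf ("tick stepped from %u to %u (+%u ms)\n", last, now, now - last);
        last = now;
    }
}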

 

But the important point in the MSDN documentation is: it CAN have a resolution of 10 to 16 ms, depending on the platform. It does not have to, and on more modern systems it often doesn't, but that can depend on other things, such as whether multimedia timers are in use (and I believe LabWindows/CVI at least used to configure the multimedia timer to 1 ms resolution, just as LabVIEW did).
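For reference, requesting a 1 ms system timer resolution through the multimedia timer API is a single pair of calls (link against winmm.lib); this is just a sketch of that mechanism, not necessarily what CVI does internally:

#include <windows.h>
#include <mmsystem.h>   /* timeBeginPeriod/timeEndPeriod, link with winmm.lib */

timeBeginPeriod (1);    /* ask the OS for a 1 ms system timer period */
/* ... timing-sensitive code; Sleep() and timeGetTime() now have ~1 ms granularity ... */
timeEndPeriod (1);      /* restore the previous timer period when done */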

 

The important point is that it can have a higher resolution but doesn't have to, so your investigation is interesting but nevertheless pretty inconclusive: you cannot expect the results you find on your system to be reproduced on another system.

 

If you need guaranteed higher resolution, you should instead use the Windows high-resolution performance counter (QPC). It doesn't guarantee a specific resolution either, but it uses a hardware counter that runs at a much higher frequency, and that frequency of course varies from machine to machine. There is, however, a function you can call to get the actual frequency of the monotonically increasing counter values. https://learn.microsoft.com/en-us/windows/win32/sysinfo/acquiring-high-resolution-time-stamps
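A minimal sketch of timing the same delay with QueryPerformanceCounter(), reusing the startDelaySecond variable from the earlier snippet:

LARGE_INTEGER freq, start, stop;
double elapsedMs;

QueryPerformanceFrequency (&freq);    /* counter ticks per second, fixed at boot */
QueryPerformanceCounter (&start);
Delay (startDelaySecond);             /* same CVI Delay() call as in the first test */
QueryPerformanceCounter (&stop);

elapsedMs = (stop.QuadPart - start.QuadPart) * 1000.0 / (double) freq.QuadPart;
printf ("elapsed : %.3f ms\n", elapsedMs);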

 

Rolf Kalbermatter
My Blog
Message 5 of 6

Thank you for taking the time; I appreciate it. I implemented the high-resolution method you suggested and it gave the same results as the previous method, but now I have confidence that the timing of my test is within spec.

Message 6 of 6