From Friday, April 19th (11:00 PM CDT) through Saturday, April 20th (2:00 PM CDT), 2024, ni.com will undergo system upgrades that may result in temporary service interruption.
We appreciate your patience as we improve our online experience.
05-02-2013 12:57 PM
I am trying to do some timing analysis of a DAQmx function call, Sleep() periods, and Delay() periods by calling GetCurrentDateTime() before and after these calls, then subtracting the former from the latter. The documentation says the granularity should be about 1 millisecond, but I always see a granularity of 16 milliseconds -- two different computers, debug or release, main thread or a launched thread, always the same.
Is this normal, expected?
Is there some setting that I have wrong?
05-03-2013 01:39 AM
I have no special clues about that particular function, but since you only want to track function timing and are not interested in localizing and formatting the time information, you could use the simpler Timer() function and take the difference between two calls. It appears to be faster than GetCurrentDateTime(); on my machine I see resolution well below 0.1 msec.
05-03-2013 08:03 AM
Roberto,
One thing I tried was replacing the GetCurrentDateTime() calls with calls to clock(). The results were virtually identical. I'll try Timer() and see if there is a difference.
This timing analysis relates to my other post on why DAQmxSwitchCloseRelays() takes so long to return. So far it looks as though all measurements are rounded to the nearest 16 milliseconds.
05-03-2013 09:17 AM
On my machine this code produces lines every 0.909 msec (550 lines in 0.5 sec):
#include <utility.h>
#include <userint.h>

static double t, t0;

GetCurrentDateTime (&t0);
do {
    GetCurrentDateTime (&t);
    DebugPrintf ("%f\n", t - t0);
} while (t - t0 < 0.5);
It does not show resolution finer than milliseconds (that is, I get roughly ten lines with the same value). It seems we are far below the 16 msec you are observing, but I am doing nothing else in the loop. Maybe your application is spending time elsewhere, which masks the timing capabilities of your system.
05-03-2013 09:58 AM
Roberto,
Not sure how to interpret that. You are still seeing granularity greater than 1 ms, even though the loop cycles much faster than 1 ms; that is why you get the same number multiple times.
I put your code into a small project, ran it, and got results with big gaps. I kept the debug window closed while it ran, so writes to the panel were not taking up time, and opened the debug window afterward. How do I interpret this?
0.000000
0.047000
0.094000
0.156000
0.203000
0.250000
0.297000
0.359000
0.406000
0.453000
0.484000   (repeated 100 times)
0.500000   (repeated 40 times)
0.547000
0.594000
0.656000
0.688000   (repeated 117 times)
0.703000   (repeated 16 times)
05-03-2013 10:40 AM
I don't know how to interpret it!
With my code running in the Interactive Execution window and the debug output window hidden, I get a slightly faster response with Timer() than with GetCurrentDateTime(), but I am running much faster than 1 kHz with both! Timer() gives a lot more digits, while GetCurrentDateTime() gives only milliseconds, but that is the only difference.
Do you observe a similar pattern, with several repeated measurements followed by large gaps, if you use Timer()?
05-06-2013 09:27 AM
Timing Analysis wrap-up: Maybe others will find this useful.
GetCurrentDateTime() has a built-in granularity of about 15-16 milliseconds, likely because it reads the Windows system clock, which by default ticks only about every 15.6 ms (64 Hz). That means if you make two successive calls to GetCurrentDateTime() with something in between, the reported difference can be off by up to about 16 milliseconds from the actual elapsed time. Not very useful.
clock() works much better. Two successive calls produce results accurate to about 2 milliseconds, with a granularity of 1 msec.
Timer() works just as well as clock(), perhaps a bit better: its granularity is less than 1 millisecond, and it appears to be accurate to about 1 millisecond.
Bottom line: if you are doing any kind of timing analysis of your code, don't use GetCurrentDateTime(); use clock() or Timer().