02-02-2017 01:17 AM
Hello NI community!
While looking for the best way to measure a time difference (much like tic/toc in Matlab), I came across the deltaus() function for real-time sequences. Its description reads: "Returns the duration of the current system timestep in microseconds."
What does this mean exactly? What is meant by "system timestep"?
Btw. my current implementation for measuring a time difference is:

startTime = tickcountus()
// DO SOMETHING
deltaT = tickcountus() - startTime
Would it be better to use seqtimeus() or maybe even deltaus() here?
Greetings,
MaWei
05-02-2017 02:19 AM - edited 05-02-2017 02:24 AM
Hi MaWei,
The system timestep simply indicates the smallest detectable incremental change of your time information. This depends on your configuration, of course. If you call this function on different architectures, you should get different results.
Regarding the second question: if you are trying to measure the time a portion of code takes to execute (known as benchmarking), your code looks fine. You are simply taking the difference of two counter-based timestamps.
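For anyone who wants to try the same two-timestamp pattern off-target, here is a rough Python sketch of it. Note the assumptions: Python and time.perf_counter_ns() are my substitutions for illustration only; the original code runs inside a VeriStand real-time sequence and uses tickcountus(), which is not available in Python.

```python
import time

# Analogue of the tickcountus() benchmarking pattern from the question.
# time.perf_counter_ns() plays the role of a free-running tick counter
# (an assumption; the real sequence uses tickcountus()).
start_time = time.perf_counter_ns()

# DO SOMETHING -- the section of code being benchmarked
total = sum(range(100_000))

# Difference of the two timestamps, converted to microseconds
delta_t_us = (time.perf_counter_ns() - start_time) / 1_000
print(f"elapsed: {delta_t_us:.1f} us")
```

The key point is the same as in your sequence: take a timestamp before, a timestamp after, and subtract; the resolution of the result is limited by the underlying counter.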
Cheers,
Bart