How accurate is the Express VI: Elapsed Time?

Hi, I don't suppose the Express VI "Elapsed Time" is accurate down to the millisecond region? It would be nice if it were typically accurate down to the 3 to 10 ms range. If not, does anyone know of a VI or technique that is, 9 times out of 10, fairly accurate in the 3 to 10 ms range, without using loops or timed loops?  Thanks, Aerospacer
0 Kudos
Message 1 of 8
The elapsed time VI is perfectly accurate, but it is not very useful without using any loops. What are you trying to do with it?

LabVIEW Champion. It all comes together in GCentral
Message 2 of 8

Hi altenbach, Thanks very much for your reply.  So how accurate is this Express VI: is it off by less than a millisecond over a 10 millisecond interval, typically? Yes, I will be using it in a regular While Loop to build two timers for a polling routine. Just out of curiosity (for future reference), does this VI have to be used in a While Loop? And why?

Key question: if one doesn't wire anything to the "Set Start Time" input, does this VI use the current time when it is invoked? It looks like, when running, one just pulses Reset momentarily to make it restart at the current time, correct? I assume that if one holds Reset true, the VI is then effectively stopped, correct?

Whoops, I got a bit carried away with the questions, sorry about that. Regards, Aerospacer


0 Kudos
Message 3 of 8
You can right-click on the express VI and select "open front panel". Inside is another VI that contains all the code for you to inspect.
The code is very simple, it uses an uninitialized shift register to keep track of the start time. It is also reentrant, meaning that each instance on the diagram keeps its own start time. This also means that it typically should run at least twice and thus should be in a loop (unless you manually set the start time). The first call simply sets the start time and outputs zero.
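That internal logic can be sketched in text form. The following is a hedged Python analogue, not the actual VI code: instance state stands in for the uninitialized shift register, `time.monotonic()` stands in for LabVIEW's clock, and the `target_s`/`reset` names are made up for illustration.

```python
import time

class ElapsedTimer:
    """Sketch of the Elapsed Time Express VI's internal logic.

    Instance state plays the role of the VI's uninitialized shift
    register; each instance (like each reentrant clone on the diagram)
    keeps its own start time. The first call just latches the start
    time and reports zero elapsed.
    """
    def __init__(self, target_s):
        self.target_s = target_s
        self.start = None          # unset until the first call

    def __call__(self, reset=False):
        now = time.monotonic()
        if self.start is None or reset:
            self.start = now       # first call (or Reset) latches the start time
        elapsed = now - self.start
        return elapsed, elapsed >= self.target_s

# Typical use: poll inside a loop, like a While Loop in LabVIEW.
timer = ElapsedTimer(target_s=0.05)
elapsed, done = timer()            # first call: elapsed ~0, done is False
while not done:
    time.sleep(0.01)               # poll every 10 ms
    elapsed, done = timer()
```

Note that, just as described above, the timer is useless if called only once: the first call merely establishes the start time.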
Of course, this Express VI is more useful for measuring time than for controlling timing, because its precision depends on how often you call it. If you set the elapsed time to 1 second but only call the VI every two seconds, the notification that the time has elapsed could be late by up to another second. 😉
If you need deterministic timing, a Timed Loop is preferable because it controls timing instead of just monitoring it.
You should also be aware that, because of OS limitations, nothing software-timed is more precise than one millisecond (or 2 ms on some modern Pentium 4 processors) unless you use LabVIEW RT on special hardware running a real-time OS.

LabVIEW Champion. It all comes together in GCentral
Message 4 of 8
Hi altenbach,  Thanks very much for your well-considered reply. Fortunately the (mostly) 1 millisecond accuracy of the While Loop will suit this polling situation reasonably well.  I will probably loop every 2 milliseconds, or do you think a 1 millisecond loop time would be better and more accurate? Thanks very much for your consideration and time.  Regards, Aerospacer
0 Kudos
Message 5 of 8

My experience is not so good. I built a Mickey Mouse VI, calling Elapsed Time, waiting x ms, then calling Elapsed Time again, in a sequence structure.

With a delay of 500 ms, everything is rosy. With 100 ms, the elapsed time is reported as either 93 ms or 109 ms; 1 ms gives zero, and 10 ms gives 16 ms. None of that can I explain.

But using the same VI structure with the millisecond tick counter, I do get 1 ms accuracy at 100 ms. If I check for wrap-around of the counter, I can get up to 4 hours or so. If I need more, I'll put the Elapsed Time VI in parallel and test the size of the result before selecting one or the other. At present I don't need that.
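The wrap-around check mentioned above can be sketched as follows. This is a Python illustration (not LabVIEW code) simulating a u32 millisecond tick count: with modular (unsigned) subtraction, a single rollover of the counter is handled automatically for any interval shorter than 2**32 ms (about 49.7 days).

```python
U32 = 1 << 32  # the tick counter is an unsigned 32-bit millisecond count

def elapsed_ms(start_tick, now_tick):
    """Wrap-safe elapsed milliseconds between two u32 tick readings.

    Modular subtraction gives the correct answer across a single
    rollover, as long as the true interval is below 2**32 ms.
    """
    return (now_tick - start_tick) % U32

assert elapsed_ms(1_000, 1_250) == 250     # normal case
assert elapsed_ms(U32 - 100, 50) == 150    # counter wrapped between reads
```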

Regards Kevin Mills

0 Kudos
Message 6 of 8
The VI uses the timestamp data type which, at least on Windows, has a resolution of ~16 ms. You can recreate the basic functionality by replacing the timestamp primitive with the Tick Count (ms) primitive, which does have 1 ms resolution but doesn't offer an absolute time value. If you do want an absolute time value, you will also need to record the start time and add the elapsed time to it.
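A rough sketch of that suggestion, in Python rather than LabVIEW: `time.monotonic()` plays the role of the fine-grained Tick Count, and `time.time()` plays the role of the coarser absolute timestamp, which is recorded only once at the start.

```python
import time

# time.monotonic() stands in for LabVIEW's Tick Count (ms) primitive
# (fine resolution, no absolute meaning); time.time() stands in for the
# coarser absolute timestamp, recorded just once.

start_abs = time.time()            # absolute start time (coarse is fine here)
start_tick = time.monotonic()      # fine-grained relative reference

time.sleep(0.1)                    # ... the work being timed ...

elapsed_s = time.monotonic() - start_tick  # fine-resolution elapsed time
now_abs = start_abs + elapsed_s            # absolute "now" rebuilt from both
```

The design point is that the coarse clock is only used once, so its quantization is a one-time offset rather than an error on every interval measurement.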

Try to take over the world!
0 Kudos
Message 7 of 8

When Auto Reset is true, the Elapsed Time Express VI cannot be used to generate an accurate timing interval.


When the interval expires (say, 10 seconds), the Elapsed output will be true on the next call (e.g., a call made 10 seconds and 100 ms after the timer was started).

But on that call, the timing of the next interval starts at 0, so the 100 ms is lost.
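The resulting drift can be illustrated with made-up numbers (a 10 s interval whose expiry is detected 100 ms late by the polling loop each time). Restarting from the late poll time, as Auto Reset does, accumulates the lateness; computing each deadline from the previous deadline instead absorbs the overshoot.

```python
interval = 10.0   # desired interval in seconds (illustrative numbers)
lateness = 0.1    # each expiry is detected 100 ms after it actually occurs

# Auto Reset behaviour: the next interval restarts at the (late) poll
# time, so every cycle inherits the 100 ms overshoot.
t = 0.0
for _ in range(3):
    t += interval + lateness
auto_reset_total = t              # 3 intervals take ~30.3 s instead of 30.0 s

# Deadline-based alternative: the next deadline is computed from the
# previous deadline, so the overshoot is not accumulated.
deadline = 0.0
for _ in range(3):
    deadline += interval
deadline_total = deadline         # exactly 30.0 s
```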
0 Kudos
Message 8 of 8