Hi altenback, thanks very much for your reply. How accurate is this primitive? Is it typically off by less than a millisecond over a 10 millisecond interval? Yes, I will be using it in a regular while loop to build two timers for a polling routine. Out of curiosity (for future reference), does this primitive have to be used in a while loop, and why?
Key question: if nothing is wired to the "Set Start Time" input, does this primitive use the current time when it is invoked? It looks like, while running, one just applies a momentary Reset to make it restart at the current time, correct? And I assume that if Reset is held true, the primitive is effectively stopped, correct?
Whoops, I got a bit carried away with the questions, sorry about that. Best regards, Aerospacer
My experience is not so good. I built a Mickey Mouse VI that calls Elapsed Time, waits x msecs, then calls Elapsed Time again, all inside a sequence structure.
Using a delay of 500 msecs, everything is rosy. Using 100 msecs, the elapsed time is reported as either 93 msecs or 109. 1 msec gives zero, and 10 msecs gives 16 msecs. None of that can I explain.
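Those numbers look like multiples of the ~15.625 ms Windows system-timer tick (1000/64): six ticks is 93.75 ms and seven is 109.375 ms, and 15.625 rounds to 16. A small Python sketch, assuming that clock granularity is the cause (this is an inference, not something confirmed in the thread):

```python
import math

TICK_MS = 1000 / 64  # ~15.625 ms: classic Windows system-timer tick (assumption)

def coarse_clock(t_ms, tick=TICK_MS):
    """A clock that only advances in whole ticks of `tick` ms."""
    return math.floor(t_ms / tick) * tick

def measured_elapsed(start_ms, wait_ms, tick=TICK_MS):
    """Elapsed time such a coarse clock reports for a true wait of wait_ms."""
    return coarse_clock(start_ms + wait_ms, tick) - coarse_clock(start_ms, tick)

# Sweeping the start phase, a true 100 ms wait reads as either 93.75 or
# 109.375 ms (truncated in a display: 93 or 109), and a 10 ms wait reads
# as either 0 or 15.625 ms (~16).
print({measured_elapsed(p, 100) for p in range(16)})
print({measured_elapsed(p, 10) for p in range(16)})
```

If that assumption holds, the Express VI is simply reading a coarser clock than the millisecond tick counter, which would explain why the latter gives 1 ms accuracy below.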
But using the same VI structure with the msec tick counter, I do get 1 msec accuracy at 100 msecs. If I check for wrap-around of the counter, I can get up to 4 hours or so. If I need more, I'll put the Elapsed Time VI in parallel and test the size of the result before selecting one or the other. At present I don't need that.
Regards, Kevin Mills
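For what it's worth, the usual way to handle wrap-around of a free-running 32-bit millisecond counter is modular (unsigned) subtraction, which stays correct across one wrap, about 49.7 days. A minimal Python sketch of the idea (the function name is illustrative, not a LabVIEW API):

```python
U32 = 2**32  # a 32-bit tick counter wraps after 2**32 ms (~49.7 days)

def elapsed_ms(start_tick, now_tick):
    """Milliseconds between two U32 tick-counter readings, safe across one wrap."""
    return (now_tick - start_tick) % U32

# Counter read 5 ms before wrapping, then again 10 ms after the wrap:
print(elapsed_ms(U32 - 5, 10))   # 15, not a huge negative number
```

The same result falls out automatically if the subtraction is done in U32 arithmetic on the diagram, since unsigned integer subtraction wraps modulo 2**32 by itself.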
When Auto Reset is true, the Elapsed Time Express VI cannot be used to generate an accurate timing interval.
When the interval expires (say 10 seconds), the Elapsed output becomes true on the next call, which might be made 10 seconds and 100 ms after the timer was started. But on that call the timing of the next interval restarts at zero, so the 100 ms is lost, and the error accumulates with every interval.