
Timed loops running slow for small periods

I ran into some timing issues in my code. I realized that with a timed loop, if I set the period to 1, 2, 3, or 4, it runs very slowly. I have attached a very simple VI, as well as a screenshot. Anyone able to tell me what is going on? 🙂

 

Message 1 of 11

Sorry, posting by phone.

Try using "Tick Count" for the timing instead.

Message 2 of 11

The code I posted isn't about finding the time elapsed or anything like that.  It is simply to illustrate the problem of my timed loop running slowly when I set the period to a small value.  

Message 3 of 11

The loop cannot go to the next iteration until all of the code inside it has finished. If I remember right, Get Date/Time In Seconds has more overhead than Tick Count, so please try my suggestion.
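For what it's worth, that kind of overhead claim can be sanity-checked with a micro-benchmark. Here is a minimal Python sketch (not LabVIEW; time.monotonic and time.time merely stand in for a tick counter and a full date/time lookup, and the call count is arbitrary):

```python
import time

def per_call_overhead(clock, n=1_000_000):
    """Average cost of one call to `clock`, in nanoseconds."""
    start = time.perf_counter()
    for _ in range(n):
        clock()
    return (time.perf_counter() - start) / n * 1e9

# time.monotonic() stands in for a millisecond tick counter,
# time.time() for a full date/time lookup.
print(f"monotonic: {per_call_overhead(time.monotonic):.1f} ns/call")
print(f"time:      {per_call_overhead(time.time):.1f} ns/call")
```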

Message 4 of 11

Well, you do maintain the original phase on missed periods (that's probably an oversight). Tossing a pair of Defer Panel Updates property nodes around the loop will reduce the number of missed periods. But I doubt you'll ever see i=4999 (exactly 0 late periods), not on a non-deterministic OS.
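To unpack "maintain original phase": a phase-maintaining loop derives every deadline from the original start time, so one overshoot can cascade into a string of "finished late" iterations, whereas discarding phase restarts the schedule from "now". A minimal Python sketch of the difference (not LabVIEW; the 1 ms period and the iteration count are just for illustration):

```python
import time

PERIOD = 0.001  # 1 ms, mirroring the period=1 case in the original VI

def run(maintain_phase, iterations=1000):
    """Software-timed loop; returns how many deadlines were missed."""
    late = 0
    deadline = time.perf_counter() + PERIOD
    for _ in range(iterations):
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # wake-up precision is at the OS scheduler's mercy
        else:
            late += 1              # the deadline had already passed
        if maintain_phase:
            deadline += PERIOD                       # next slot from the original schedule
        else:
            deadline = time.perf_counter() + PERIOD  # restart the schedule from "now"
    return late

print("late iterations, maintaining phase:", run(True))
print("late iterations, discarding phase: ", run(False))
```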


"Should be" isn't "Is" -Jay
Message 5 of 11

altenbach:  Thanks for your reply, and sorry for not making myself clear: The code inside that timed loop takes about 175 nanoseconds to execute. That isn't my problem.

 

Jeff: My issue is that the timed loop is SUPPOSED to be used for "... VIs with multirate timing capabilities, [and] precise timing." I assume that I am doing something wrong in setting up or configuring the loop. After all, why offer a timed loop for said precise timing if it can't actually be accurate?

Message 6 of 11

You're missing the point. Timed loops let you do some things that while loops won't, but they cannot turn a Windows OS into a deterministic system. Now, drop one of those beasties onto a Real-Time target and you can expect less "jitter."


"Should be" isn't "Is" -Jay
Message 7 of 11

The OS also has other things to do.

 

If I count the number of "finished late (i-1)" occurrences, it is between about 2 and 130 here, so clearly your hardware/OS combination simply cannot handle it. This is not the fault of LabVIEW. As has been said, you need LabVIEW RT on a dedicated system.

 

What are you actually trying to do? Maybe a hardware-timed acquisition would be more reasonable.
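For reference, this is roughly what hardware-timed acquisition looks like from the Python nidaqmx package (a hedged sketch; "Dev1/ai0", the 1 kHz rate, and the sample count are placeholders). The point is that the device's onboard sample clock paces the samples, so their spacing no longer depends on the OS scheduler at all:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

# The device's onboard sample clock paces the acquisition;
# the OS only has to drain the buffer now and then.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")      # placeholder device/channel
    task.timing.cfg_samp_clk_timing(
        rate=1000.0,                                      # 1 kHz, i.e. one sample per ms
        sample_mode=AcquisitionType.CONTINUOUS,
    )
    data = task.read(number_of_samples_per_channel=1000)  # exactly 1 s of samples
```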

Message 8 of 11

Timing is definitely possible in Windows.  In fact, calling the kernel32.dll library directly gives accurate timing down to the microsecond (or less).  I'll just redo my project and call the C library directly. Thank you both for your help.
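Presumably "calling the kernel32.dll library directly" means the QueryPerformanceCounter / QueryPerformanceFrequency pair, which reads a high-resolution counter. A minimal ctypes sketch (Windows only; the timed section is a placeholder). Note this gives high-resolution measurement of elapsed time; it does not make a Windows wait any more deterministic:

```python
import ctypes

kernel32 = ctypes.windll.kernel32  # Windows only

_freq = ctypes.c_int64()
kernel32.QueryPerformanceFrequency(ctypes.byref(_freq))  # counts per second, fixed at boot

def qpc_seconds():
    """Current value of the high-resolution performance counter, in seconds."""
    count = ctypes.c_int64()
    kernel32.QueryPerformanceCounter(ctypes.byref(count))
    return count.value / _freq.value

start = qpc_seconds()
# ... placeholder for the code being timed ...
print(f"elapsed: {(qpc_seconds() - start) * 1e6:.2f} microseconds")
```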

Message 9 of 11

I don't get it; what is the difference from your LabVIEW code?

 

Your first VI ends at 5.00029 s, and the second VI with the DLL call takes 5.00429 s.

 

So what have you verified?

 

In your second VI, what is the value of i from the outermost loop when it completes?

Do you really think that the innermost loop can wait for only 1 µs (one microsecond)?


Your second VI does not prove that the timing is better when you call a DLL on a Windows operating system.
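The 1 µs question is easy to test directly. A minimal Python sketch (the requested duration is illustrative): ask the OS for a microsecond wait and measure what you actually get. On a stock Windows install the scheduler's timer granularity is on the order of a millisecond or more, so the measured wait comes back orders of magnitude longer than requested:

```python
import time

for _ in range(5):
    t0 = time.perf_counter()
    time.sleep(1e-6)  # ask the OS for a 1 microsecond wait
    elapsed_us = (time.perf_counter() - t0) * 1e6
    print(f"requested 1 us, actually waited {elapsed_us:,.0f} us")
```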

Message 10 of 11