

timing on first iteration wrong

Solved!

Hi folks!

 

I have a problem with a timer that gives me a wrong result, but only on the first iteration (see code below). This test code is supposed to measure the time the loop needs to iterate x times.

I would expect to always get the same time difference (i.e. "period" = 10 ms with the settings used in the screenshot), but in fact the first iteration always yields a smaller number;

every iteration after the first one yields the right result.

 

Many thanks for your help in advance!

 

Luke

Message 1 of 5

Your 1000 millisecond Wait Until Next ms Multiple runs in parallel with everything else.

 

So in that first iteration, your initial time stamp and the one in the case structure will happen very quickly, because the comparison code executes very quickly.  Only after the Wait Until Next ms Multiple finishes will the loop iterate again.

 

Also, you are using the wrong wait function.  Wait Until Next ms Multiple makes the VI wait only until the system clock reaches the next multiple of that value.  So if your VI starts half a second into a one-second multiple of the system clock, the first loop iteration will only take half a second.

 

If you used the regular Wait (ms) function, each loop iteration would actually take the full amount of time you are trying to wait.
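Since LabVIEW code is graphical, the difference between the two functions can only be sketched here; the following Python model (function names are my own, not LabVIEW APIs) illustrates the semantics described above:

```python
import time

def wait_ms(ms):
    """Model of Wait (ms): always waits the full duration."""
    time.sleep(ms / 1000.0)

def wait_until_next_ms_multiple(ms):
    """Model of Wait Until Next ms Multiple: waits only until the
    system clock reaches the next multiple of `ms`, so the first
    call in a loop usually returns early."""
    now_ms = time.monotonic() * 1000.0
    remainder = now_ms % ms
    if remainder > 0:
        time.sleep((ms - remainder) / 1000.0)

# wait_ms(1000) always takes ~1 s; wait_until_next_ms_multiple(1000)
# takes anywhere from 0 to 1 s on its first call, which is why the
# first loop iteration measures short.
```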

Message 2 of 5

Thanks RavensFan, Knight of NI, for the fast response!

 

I just exchanged the wait function. It became more stable, but that unmasked an additional error: the first measurement is one cycle too short, and I don't see why...

 

Let's say the wait function is set to 500 ms: for the first iteration it tells me the time needed is 4.5 s, but every later iteration gives the correct value. If the wait function is set to 1000 ms, I'll get 4 s, etc.

 

Best wishes,

 

Luke

 

 

Message 3 of 5
Solution
Accepted by topic author BrainDrain

As I said, on the very first iteration the code that checks the times executes in parallel with the code that does the time-difference calculation.  More precisely, the error shows up in the Nth iteration: the time difference has been taken N times, but only N-1 waits have finished, because the Nth wait is happening in parallel with (and thus finishes after) the Nth time calculation.

 

The next time around you won't see the problem, because the Nth wait will have completed before the (N+1)th iteration.

 

You need to set up a dataflow dependency in the loop (for example, a simple two-frame flat sequence) so that the calculation happens after the Wait function.
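As a rough sketch (again in Python rather than LabVIEW, with invented names), the two orderings differ by exactly one wait period:

```python
import time

def measure_unordered(n, period_s):
    """Clock read runs 'in parallel' with the wait: each elapsed-time
    reading is taken before that iteration's wait finishes, so the
    final reading is one period short."""
    start = time.monotonic()
    for _ in range(n):
        elapsed = time.monotonic() - start  # nth reading, only n-1 waits done
        time.sleep(period_s)                # nth wait finishes afterwards
    return elapsed

def measure_sequenced(n, period_s):
    """Two-frame flat sequence: frame 1 waits, frame 2 reads the
    clock, so every reading includes every wait completed so far."""
    start = time.monotonic()
    for _ in range(n):
        time.sleep(period_s)                # frame 1: the Wait
        elapsed = time.monotonic() - start  # frame 2: read the clock
    return elapsed
```

With a 500 ms wait, the unordered version comes out half a second short of the sequenced one, matching the symptom described above.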

Message 4 of 5

Thanks for the help! I finally made it work by adding a flat sequence as you suggested. I'd tried this before, but I had forgotten to move the clock that initializes the feedback node in front of the timer ...

I have attached the final screenshot.

 

Best wishes,

 

Luke

Message 5 of 5