02-15-2011 09:00 AM
I am using a Timed Loop to get deterministic timing for data acquisition. I tested the structure with a period of 1000 ms: every iteration takes 1000 ms except the first, which completes in just a few ms. I expected 1000 ms for the first iteration as well, of course. I need the measurements to be equally spaced in time from the very first point. How do I solve this? Is there any information from NI about this behavior?
Windows XP.
LabVIEW 2010.
02-15-2011 09:06 AM
hasun,
Try using the Offset / Phase setting of the Timed Loop together with your 1000 ms period.
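LabVIEW code is graphical, so the following is only a Python analogy of what the Offset / Phase (t0) input does, under the assumption that you want every sample, including the first, on a 1000 ms grid; the function and parameter names here are mine, not LabVIEW's:

import time

def timed_loop(period_s, offset_s, n_iterations, body):
    # Rough analogy of a Timed Loop: offset_s plays the role of the
    # loop's Offset / Phase (t0), period_s the role of the period (dt).
    start = time.monotonic()
    for i in range(n_iterations):
        # Each iteration is scheduled at start + offset + i * period, so
        # with offset == period even iteration 0 fires one full period
        # after the loop starts.
        target = start + offset_s + i * period_s
        time.sleep(max(0.0, target - time.monotonic()))
        body(i, time.monotonic() - start)

timed_loop(1.0, 1.0, 5, lambda i, t: print(f"iteration {i} at {t:.3f} s"))

With offset_s = 0.0 you get exactly the behavior you observed: the first iteration completes almost immediately and only the later ones are spaced a full period apart.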
02-15-2011 09:20 AM
The first iteration happens without any delay as soon as the Timed Loop starts; the 1000 ms delay is applied between iterations.
If you need the first reading to occur at a defined time after some event, add an initial delay before the Timed Loop.
In a Timed Loop (see the sketch after this list):
Iteration 0 happens at time 0
Iteration 1 happens at +1000ms
Iteration 2 happens at +2000ms
and so on.
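As a rough illustration only (LabVIEW is graphical, so this is a Python analogy rather than LabVIEW code, and the names are mine), the default schedule above looks like this:

import time

# Minimal sketch of the default Timed Loop schedule described above:
# with no offset, iteration 0 runs immediately and each later iteration
# is scheduled one full period after the previous one.
PERIOD_S = 1.0
start = time.monotonic()
for i in range(4):
    # Wait until this iteration's slot on the period grid. Slot 0 is at
    # time 0, so the first pass does not wait at all.
    time.sleep(max(0.0, start + i * PERIOD_S - time.monotonic()))
    print(f"iteration {i}: +{time.monotonic() - start:.3f} s")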