LabVIEW


time behaviour of code inside loops

Hi

I have a loop that acquires some analog data, does some calculations and - if the values match some conditions - sets some digital outputs.

The loop contains a "Wait Until Next ms Multiple" function to set the sample rate. This timer runs in parallel with the rest of the code in the loop.

Now I connected an oscilloscope to the analog and the digital board and watched the timing: the time between the analog rise and the setting of the digital output.

Surprisingly, it depends mostly on the sample rate. If I set the rate (the waiting time) to 1 ms, the digital output responds within that 1 ms. If I set it to 20 ms, the digital output also responds within that 20 ms, but with an average of 10 ms.

It seems LabVIEW knows that there's plenty of time in a 20 ms loop and thus spreads the execution of the code inside the loop over that time.

Can I change that? Can I tell LabVIEW to execute the code all at once and then wait for the rest of the time, do nothing or drink some tea, but give me a bit more real-time behaviour?
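[Editor's note: the ~10 ms average with a 20 ms period is exactly what uniform sampling latency predicts. If the analog edge arrives at a random phase within the loop period, the loop only reacts at the next iteration boundary, so the latency is uniform on [0, period) and averages half the period. A minimal Python sketch of that arithmetic (an analogy only; LabVIEW itself is graphical):]

```python
import random

def average_latency_ms(period_ms, trials=100_000, seed=0):
    """Simulate an event arriving at a random phase within a loop period.

    The loop only notices the event at the next iteration boundary, so
    the response latency is uniform on [0, period) and averages period/2.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        phase = rng.uniform(0.0, period_ms)   # event time within the period
        total += period_ms - phase            # wait until the next boundary
    return total / trials

# With a 20 ms loop the mean response time comes out near 10 ms,
# matching the behaviour seen on the oscilloscope.
print(average_latency_ms(20.0))
```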

Thanks for ideas,
Daniel Troendle
Message 1 of 5 (2,336 Views)
To have your loop body run first and your time delay second: put the delay in a single-frame sequence structure outside (to the right of) your loop code. Wire at least one data item from the loop body to this sequence (such as a Boolean 'loop completed' indicator). This forces the body to execute first and the delay second. By wiring a simple Boolean value from structure A to structure B, you control the sequence without the code-hiding effect that stacked sequences have.
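[Editor's note: a rough Python analogy of this wiring trick, assuming time.sleep stands in for the wait function. Making the delay *consume* an output of the loop body is what forces the body to finish first, just as the wired Boolean does in the diagram:]

```python
import time

events = []

def do_work():
    """Stands in for the loop body: acquire, compute, set outputs."""
    events.append("work")
    return True  # the Boolean 'loop completed' flag wired across

def delay_after(done_flag, ms):
    """A delay that consumes the loop body's output, so dataflow
    forces it to run second, like the single-frame sequence above."""
    events.append("delay")
    if done_flag:
        time.sleep(ms / 1000.0)
    return done_flag

delay_after(do_work(), 1)  # the work is evaluated first, then the delay
print(events)
```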
Good luck with it, and have some tea too, Doug
Message 2 of 5
I have a VI that simply contains the "Wait Until Next ms Multiple" function and Error In/Out clusters. This way I can drop it into my dataflow so that it delays exactly where I want. Make sure you wire the error clusters through inside the subVI if you are using error checking. Of course, this method depends on using the error wire for dataflow, which I think is a good idea.
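[Editor's note: a sketch of this wait subVI in Python, with a plain dict standing in for the error cluster; skipping the wait when an upstream error is set mirrors typical error-cluster behaviour:]

```python
import time

def wait_ms(error_in, ms):
    """Pass the error cluster through so the wait sits *in* the dataflow,
    sequencing it relative to the nodes wired before and after it."""
    if error_in.get("status"):      # upstream error: don't bother waiting
        return error_in
    time.sleep(ms / 1000.0)
    return error_in                 # error out == error in, threading the flow

no_error = {"status": False, "code": 0, "source": ""}
print(wait_ms(no_error, 1))
```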

Brian
Message 3 of 5
Try the "Wait (ms)" function.
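[Editor's note: the difference matters here. Wait (ms) delays a fixed amount from the moment it executes, while Wait Until Next ms Multiple delays until the millisecond clock reaches the next multiple of the period, so its delay varies between 0 and the period. A small Python sketch of the arithmetic (function names are illustrative, not a real API):]

```python
import time

def wait_fixed_ms(ms):
    """Like Wait (ms): a fixed delay from the moment it is called."""
    time.sleep(ms / 1000.0)

def ms_until_next_multiple(now_ms, period_ms):
    """Like Wait Until Next ms Multiple: how long until the millisecond
    clock next hits a multiple of period_ms (0 if it is on one already)."""
    return (period_ms - now_ms % period_ms) % period_ms

# At t = 1234 ms with a 20 ms period, the next multiple is 1240 ms.
print(ms_until_next_multiple(1234, 20))
```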
Message 4 of 5
....
> Surprisingly, it depends mostly on the sample rate. If I set the rate
> (the waiting time) to 1ms, the digital output responds within that
> 1ms. If I set the rate to 20ms, the digital output also responds
> within that 20ms with an average of 10ms.
>
> It seems, LabView knows that there's plenty of time in a 20ms-loop and
> thus spreads the execution of the code inside the loop over that time.
>
> Can I change that? Can I tell LabView to execute the code rather at
> once and then wait for the rest of the time, do nothing or drink some
> tea but give me a bit more realtimeness?
>

It sounds like the I/O knows the sampling rate and the driver basically waits until the correct time to make the reading. There are multiple ways of doing the I/O, including making it a single-point read and using SW timers like Wait (ms) or Wait Until Next ms Multiple to throttle the loop.

What you will find is that the SW timing will be easy to implement, but will have the same jitter as the OS. This may be acceptable for you, but often you will want to initialize a periodic HW-timed reading operation, start the operation, and then in the loop read from the back side of the buffer using the lower-level AI Read icon and, I think, a negative offset into the data. I'm not much of an expert on DAQ, so I'm sure I'm not describing it correctly, but I know that this can be done and it will help avoid the spin loop that the driver performs if you read in advance of the data having arrived. Another technique is to use a DAQ occurrence to notify you when the I/O is available, again so that you do not make the read call until the data is ready.
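[Editor's note: the "read from the back side of the buffer" idea can be sketched in Python with a toy circular buffer. The class and method names are hypothetical; real code would use the DAQ driver's read call with an offset relative to the newest sample:]

```python
from collections import deque

class AcquisitionBuffer:
    """Toy model of a HW-timed circular acquisition buffer. The driver
    appends samples at a fixed rate; the loop reads the *newest* ones
    instead of draining from the oldest and risking a wait."""
    def __init__(self, size):
        self.buf = deque(maxlen=size)   # old samples are overwritten

    def hw_write(self, sample):
        self.buf.append(sample)         # done by the driver/DMA in reality

    def read_newest(self, n):
        """Return the n most recent samples: the rough equivalent of a
        read with a negative offset from the current write position."""
        return list(self.buf)[-n:]

daq = AcquisitionBuffer(size=8)
for s in range(20):                     # the driver fills the buffer over time
    daq.hw_write(s)
print(daq.read_newest(3))               # the loop always sees the freshest data
```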

Greg McKaskle
Message 5 of 5