LabVIEW

Use the LabVIEW time delay in a while loop instead of relying on the instrument's inherent timing loop.

Solved!

Okay, so now my understanding:

  • You want to take a measurement every 1 s, and that is how you originally set up the instruments.
  • However, with your original setup (the block diagram image you provided), your loop iterations had a nominal period of 1 s but 30 to 40 ms of jitter.
  • Therefore you are asking whether you can set the instrument to send data faster (e.g. every 100 ms) and then have your target wait and return only the most recent value it has received, at 1 s intervals.

 

Even if you get that working and collect the data but throw out all but the most recent reading, the value you keep can still be up to nominally 100 ms old.  And to get significantly less jitter (sub-1 ms) you will need to use timed structures on a real-time target.
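The trade-off described above can be sketched in ordinary code (Python here, since a LabVIEW block diagram is graphical). `read_instrument()` is a hypothetical stand-in for the real driver/VISA read; the loop polls it every 100 ms, keeps only the newest value, and consumes one value per second. Most readings are discarded, and the consumed value can be up to one poll period old:

```python
import time

POLL_PERIOD = 0.1     # instrument delivers a new value every 100 ms (assumed)
SAMPLE_PERIOD = 1.0   # the application only wants one value per second

def read_instrument():
    """Hypothetical stand-in for the real driver/VISA read."""
    return time.monotonic()

latest = None                     # only the newest reading survives
reads = 0
kept = []                         # values actually consumed, once per second
next_sample = time.monotonic() + SAMPLE_PERIOD
deadline = time.monotonic() + 2.3

while time.monotonic() < deadline:
    latest = read_instrument()    # each poll overwrites the previous value
    reads += 1
    if time.monotonic() >= next_sample:
        kept.append(latest)       # worst case, this value is ~POLL_PERIOD old
        next_sample += SAMPLE_PERIOD
    time.sleep(POLL_PERIOD)       # software timing: the 30-40 ms jitter enters here
```

Note that `time.sleep()` here plays the role of the LabVIEW wait function, so the same OS-scheduling jitter applies; only a timed structure on a real-time target would tighten it below 1 ms.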



Message 11 of 13
And as regards the instrument buffer: yes, it can overflow if you don't read it fast enough. I would eliminate the wait altogether and use the cycle time you pass to the instrument as the chart's X-axis multiplier.
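The suggestion above (drop the explicit wait, let the blocking instrument read pace the loop, and set the chart's X spacing to the instrument cycle time) can be sketched like this, again with a hypothetical `blocking_read()` standing in for the driver call:

```python
import time

CYCLE_TIME = 0.1  # seconds per point, as configured on the instrument (assumed)

def blocking_read(cycle_time=CYCLE_TIME):
    """Hypothetical driver read: blocks until the instrument's next value
    is ready, so it paces the loop by itself - no extra Wait needed."""
    time.sleep(cycle_time)
    return 42.0   # dummy measurement value

t = 0.0
points = []       # (x, y) pairs; x advances by CYCLE_TIME,
                  # like a chart whose X-axis multiplier is the cycle time
for _ in range(5):
    y = blocking_read()        # the read supplies the loop timing
    points.append((t, y))
    t += CYCLE_TIME
```

Because the read drains the instrument at the rate it produces data, the buffer cannot back up, and the X spacing of the plotted points equals the instrument cycle time.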
Message 12 of 13

Thanks, got it. I think you are correct.

Message 13 of 13