LabVIEW


Indicator for a waveform array?

There is no place in your code where you have a setting of '25000'. With a front panel 'Test Frequency' setting of 10 samples/second, your acquisition will take 10 seconds since you are always requesting 1000 samples.

 

p.s. Why do you want to discard all of the samples and just show the latest?

Message 21 of 32

Vt, the indicators do the same thing; they update around every 11 seconds. Dennis, my rate is 25000 per channel; it's in the VI.

Message 22 of 32

It looks like the update rate of the indicator is dependent on the sample rate. When I change to 10 Hz, the indicator updates a lot faster. Is there a way around this? My sample rates need to be as they are to ensure I'm recording 100 readings per cycle. How can I get these indicators to run in real time?
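For reference, the arithmetic behind that constraint as a small Python sketch (not LabVIEW); the 1000 samples per read is an assumption for illustration, the other numbers come from this thread:

signal_frequency_hz = 10      # the "10 Hz" test frequency
readings_per_cycle = 100      # 100 readings per signal cycle
samples_per_read = 1000       # samples requested per read (assumed)

sample_rate = signal_frequency_hz * readings_per_cycle   # 1000 samples/sec needed
update_interval = samples_per_read / sample_rate         # 1.0 s between indicator updates

print(sample_rate, update_interval)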

Message 23 of 32

I need the indicators to show real-time values because, before I start this test, I need to adjust a Displacement Transducer to ensure it's within a certain range.

Message 24 of 32

In the VI you posted, there is NO setting of anything to 25000. I did a search for the text '25000' and it was not found. If you have it in yours, then you posted a different VI.

 

There is a simple arithmetic relationship between sample rate and number of samples. If you set a sample rate of 1000 samples/sec and request 1000 samples, it will take 1 second to return all of the samples. If you change the number of samples to 2000, it will take 2 seconds to return them all. If you request 500 samples, it will take 0.5 seconds. In the VI you actually posted, you have set the number of samples to 1000 - not 100 (ULx Timing), so your statement about 100 samples does not make sense either. So, I would suggest you set the samples/sec based on the signal frequency you wish to acquire (remember Nyquist) and then adjust the number of samples based on how fast you want the indicators to update.
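The same relationship in a short Python sketch (not LabVIEW), using the example numbers above:

def acquisition_time_s(num_samples, sample_rate):
    # Time for one read to complete: samples / (samples per second).
    return num_samples / sample_rate

print(acquisition_time_s(1000, 1000))   # 1.0 s
print(acquisition_time_s(2000, 1000))   # 2.0 s
print(acquisition_time_s(500, 1000))    # 0.5 s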

Message 25 of 32

So, if my sample rate is 100 samples/sec at 1 Hz, and my samples per channel is 100, then it should update my indicator every 1 second?

Message 26 of 32

Yes. The basic arithmetic is seconds = (number of samples) / (samples/sec).
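A quick Python check of the numbers from the previous message, using that same formula:

number_of_samples = 100   # samples per channel
samples_per_sec = 100     # sample rate
print(number_of_samples / samples_per_sec)   # 1.0 second between indicator updates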

Message 27 of 32

I think the default was -1? I set the read rate to 1 and it seems to be doing what it's supposed to...

Message 28 of 32

Now I'm having an issue with my measurements. If I'm at 10 Hz and want to take 10 cycles of data, I should get 100 data points, but it's coming out to around 320.

Message 29 of 32

We aren't going to be able to debug the behavior of your DAQ device.

 

You should step back, make a simple VI that just reads the DAQ device and updates a front panel chart and indicator with it.  Put in some indicators that will let you determine the loop iteration speed.  Play with the sample rate and number of samples there.  Once you know a set of parameters that work for you, then try to implement those parameters into your original code.
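Not LabVIEW, but the same test can be prototyped in a few lines of Python; read_samples below is a hypothetical stand-in for the DAQ read, assumed to block for roughly samples/rate seconds, and the loop just times each iteration while you vary the two parameters:

import time

def read_samples(num_samples, sample_rate):
    # Stand-in for the DAQ read: a hardware-timed read blocks for
    # roughly num_samples / sample_rate seconds before returning data.
    time.sleep(num_samples / sample_rate)
    return [0.0] * num_samples

sample_rate = 1000   # samples/sec - vary this
num_samples = 1000   # samples per read - vary this

for i in range(5):
    start = time.monotonic()
    data = read_samples(num_samples, sample_rate)
    elapsed = time.monotonic() - start
    print(f"iteration {i}: {elapsed:.2f} s for {len(data)} samples")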

Message 30 of 32