
Analog Output Non-Regeneration Too Slow

Hi,

 

I have a 9263 module to generate an analog output voltage, but I would like to increase/decrease the voltage via an input on the front panel. I am using the Voltage (non-regeneration) Continuous Output example code in LabVIEW 2019 and set the sample rate to 400 kHz (the max for this module). However, when I change the amplitude, the change on the output channel only appears after approx. 10 s, which is too slow.

 

 

0 Kudos
Message 1 of 10
(703 Views)

Your front panel has tons of controls scattered over many tabs and some are labeled "voltage". Can you pinpoint where you are changing the voltage?

Can you set all controls to your actual values and make them default? I can't see the 400kHz.

0 Kudos
Message 2 of 10
(696 Views)

Between the OP and Christian's reply, I can be assured that no shipping examples are being used. (None of those have controls scattered all over the FP with duplicate labels.)

 

I'll be on my phone most of today, so I can't open a VI, but snippets with attached comments would be enough for anyone to debug.


"Should be" isn't "Is" -Jay
0 Kudos
Message 3 of 10
(688 Views)

@JÞB wrote:

Between the OP and Christian's reply I can be assured that no shipping examples are being used. (none of those have controls scattered all over the FP with duplicate labels)


I am pretty sure it is a shipping example, but the only controls with "voltage" in the label are "max voltage" and "min voltage", and these are of course only read once. I suspect they are changing the "Amplitude" of the waveform. Unless we know the actual values of all controls used, we cannot tell what's going on.

 

[attached screenshot: altenbach_0-1691412148942.png]

 

0 Kudos
Message 4 of 10
(684 Views)

Hi,

 

I uploaded a screenshot with the control that I use to increase/decrease the voltage, and also upload the code with the default values I am using. 

0 Kudos
Message 5 of 10
(670 Views)

@veereshr wrote:

I uploaded a screenshot with the control that I use to increase/decrease the voltage, and also upload the code with the default values I am using. 


Where?

0 Kudos
Message 6 of 10
(665 Views)

That kind of latency is *usually* due to side effects of DAQmx's default buffering behavior. The default behavior tends to err on the side of caution, keeping all the buffers as full as possible at all times, but you pay for it in latency. When you make a change, the new data has to work through all the buffer stages before it can become a real-world signal.

 

There will be a task buffer that you can write to from LabVIEW.

There will be a device-level FIFO buffer.  For cDAQ, this is often spec'ed in the chassis.

There may be a USB transfer buffer in between if your cDAQ is connected via USB.

 

All that being said, the sum total of these buffers doesn't typically account for multiple seconds when generating samples at hundreds of kHz. Most threads I've seen with AO latency issues involved much lower sampling rates, in the 100-1000 Hz range.
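To see why the numbers don't add up, here is a rough back-of-the-envelope sketch of the worst-case latency from stacked buffers. The buffer sizes below are illustrative assumptions, not published specs for the 9263 or any cDAQ chassis:

```python
# Rough worst-case latency estimate for buffered, non-regenerating AO.
# All buffer sizes are assumed/illustrative, not actual device specs.

sample_rate_hz = 400_000        # requested AO sample rate
task_buffer_samples = 8_000     # host-side DAQmx task buffer (assumed)
usb_buffer_samples = 2_000      # USB transfer buffer (assumed)
device_fifo_samples = 1_000     # device-level FIFO (assumed)

# A change written to the task buffer must drain through every stage
# before it reaches the physical output.
total_buffered = task_buffer_samples + usb_buffer_samples + device_fifo_samples
latency_s = total_buffered / sample_rate_hz
print(f"Worst-case buffer latency: {latency_s * 1000:.1f} ms")
```

Even with generous buffer guesses, this lands in the tens of milliseconds at 400 kHz, nowhere near the ~10 s the OP reports, which is why the measured latency needs another explanation.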

So I *want* to think it's buffer-driven latency, but the numbers don't really seem to add up.  Can you describe how you measure the latency time or tell any other details about how you're running things?

 

-Kevin P

 

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
0 Kudos
Message 8 of 10
(638 Views)

Thanks.

So, even though you specify a 400 kHz sample rate, we can see the readback is only 1000 S/s. With 8k samples per write, that means the waveform settings are only updated every 8 seconds, which is pretty much the delay you originally mentioned.

 

To update the waveform settings more frequently, reduce the samples per generation.
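A minimal sketch of the arithmetic behind this advice (the 8000-sample and 1000 S/s figures come from the posted screenshot; the function name is just for illustration):

```python
# In non-regenerating AO, a parameter change (e.g. amplitude) only takes
# effect once the previously written chunk of samples has fully drained.
def update_period_s(samples_per_write: int, actual_rate_hz: float) -> float:
    """Seconds between chances to apply new waveform settings."""
    return samples_per_write / actual_rate_hz

print(update_period_s(8_000, 1_000))    # 8.0  -> matches the observed ~10 s lag
print(update_period_s(8_000, 400_000))  # 0.02 -> what 400 kHz would have given
```

So either reducing the samples per write or achieving the intended sample rate shrinks the update period proportionally.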


0 Kudos
Message 9 of 10
(624 Views)

When you run the VI on your real device, what does the 'Actual Sample Rate' indicator say? I noticed the 1000 Hz in your screenshot too, but thought it might just be a default value that didn't get updated.

I tried running with a simulated 9263 and it returned 400 kHz as the actual rate, but it also issued a warning for violating the device's capability, which is only 100 kHz according to the spec sheet.

 

I don't 100% trust the behavior of simulated devices, but it's the closest I'll be able to get.

 

 

-Kevin P

0 Kudos
Message 10 of 10
(604 Views)