Unexpected Delay in Analog Output

Hello LabVIEW Community,

 

I've been dealing with this issue for the last few days and have yet to find a solution on the message boards or through my own tinkering. I have attached a slimmed-down version of my code, which reads an optical encoder and outputs an analog voltage at the same time using DAQmx Read and Write. I am using a While Loop so that everything is measured at the same instant, and the Sample Clock used to trigger the Write command is set to Continuous Samples. I would like a measurement frequency in the range of 1 kHz, so I write my output signal (a sine wave) using 1 Channel - 1 Sample per iteration of the loop.

However, when I enable my voltage output, there is a significant delay (~10 seconds) before I start seeing the signal on my external oscilloscope, even though I have no delay programmed anywhere. I also find that increasing the Sample Clock rate reduces the delay, but it does not disappear entirely. If I increase the rate too much, the output signal becomes very messy and has several jumps.

 

I've attached my VI, please let me know if there is anything else I can include to help address the problem.

Message 1 of 4

Hi Mike,

 


@Mike_H_NCSU wrote:

I would like to get a measurement frequency in the range of 1kHz, and to do so, I am write my output signal (a sine wave) using 1 Channel - 1 Sample per iteration of the loop.


For sample rates >= 100 Hz you should NOT use any "1 Sample" mode of DAQmx Read!

 

  • You should also NOT write to a file in the very same loop while also trying to reach a 1 kHz write rate!
  • And you should NOT use that waveform generation function just to create a single sample: is some basic math involving the sin() function too complicated?
  • You also don't need that sequence structure…
  • Do you really need to format the loop iterator integer using engineering formatting? Ever heard of FormatIntoString to format 3 values into a string at once? Ever heard of FormatIntoFile when you want to write a formatted string into a file?
  • You don't close your file!?
  • What's the point of an empty case structure?
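(Since LabVIEW is graphical, here is a text sketch, in Python, of the "basic math" meant above: one sine sample computed directly from the loop iteration count, no waveform-generation function needed. The amplitude, frequency, and rate values are placeholders.)

```python
import math

def sine_sample(i, amplitude=1.0, freq_hz=10.0, rate_hz=1000.0):
    """One sine-wave sample for loop iteration i -- just sin(), nothing more."""
    return amplitude * math.sin(2 * math.pi * freq_hz * i / rate_hz)

print(sine_sample(25))  # at 10 Hz and 1 kS/s, iteration 25 is a quarter period
```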

 

What are you really trying to implement? What is the requirement for reading AND writing 1 sample at 1 kS/s (or even higher)?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 4

Remember the Rules of Data Flow (on which LabVIEW is based).  In particular, a Structure (such as a While Loop) or Function cannot exit (or, in the case of a While Loop, begin its next iteration) until everything inside it has run.

 

You are acquiring Analog data at some frequency (possibly 1 kHz -- it is difficult to tell where the "1000" is wired on your diagram).  Inside the While loop, you generate a single output (taking 1 ms), and do lots of other things (making a Graph, taking a single input, generating a function), and you do this for every single point.

 

I'm 90% certain that the problem is in the DAC-section of the code.  Here's what I recommend that you try:

  • Write a "demo" program that just does Analog Output of a single channel at 1 kHz (look at the output with an Oscilloscope).
  • Set up your DAQ device to do 1000 points at 1 kHz, use Continuous Samples.
  • Before entering the While Loop to write to the D/A, run the Function Generator to generate the first 1000 points, and send it into the While Loop using a Shift Register.
  • Inside your While loop, wire the Shift Register with the 1 second of Waveform data to the Data Input of the DAQmx Write command, and use Analog Waveform, 1Chan, NSamp.
  • Also inside the While Loop, but not connected to the Error Line going into or out of the DAQmx Write, generate the next 1000 points of your Waveform and wire it to the "right-hand side" of the Shift Register you just used.  Do you understand what this is doing, and how you now have two parallel processes running together: one (the D/A part) that takes 1 second to complete, and the other (the Waveform generation) that takes < 1 ms, so it will "always be ready"?
  • Feel free to use Timers to check how often this loop runs.
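To make the recipe above concrete, here is a sketch in Python/NumPy of the chunk-generation half (the part that feeds the Shift Register); the DAQmx Write itself is left as a comment, and the frequency value is a placeholder. The key detail is that each chunk starts where the previous one left off, so the phase stays continuous across writes.

```python
import numpy as np

RATE = 1000    # sample clock, S/s
CHUNK = 1000   # samples per DAQmx Write (1 second of data)
FREQ = 10.0    # sine frequency in Hz (placeholder)

def next_chunk(start_sample):
    """The next CHUNK samples of the sine wave, phase-continuous
    with everything generated before it (the Shift Register's job)."""
    n = np.arange(start_sample, start_sample + CHUNK)
    return np.sin(2 * np.pi * FREQ * n / RATE)

chunk = next_chunk(0)              # generated BEFORE entering the While Loop
start = CHUNK
for _ in range(3):                 # stands in for the While Loop
    # DAQmx Write (Analog Wfm 1Chan NSamp) would consume `chunk` here,
    # taking ~1 s; generating the next chunk takes well under 1 ms.
    chunk = next_chunk(start)
    start += CHUNK
```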

Bob Schor

Message 3 of 4

What's your DAQ device for analog output?  A delay on the order of ~10 seconds is almost certainly due to your DAQ device's onboard buffer.  I'm gonna venture that you might have a device with an 8192-sample buffer.

 

As your code is written, you'll iterate very very rapidly at first, writing 1 sample at a time into the task buffer.  Meanwhile, behind the scenes DAQmx is transferring those samples down to the device's onboard buffer until it fills up.  Once it does, you'll have a lag > 8 seconds between writing new samples to the task buffer and when the signal shows up at the physical output pin.   When you set the sample rate higher than 1000 Hz, you're still delayed by 8192 samples, but it no longer takes 8 seconds to burn through them.
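Spelling out the arithmetic behind that lag (the 8192-sample FIFO is a guess, not a confirmed spec for your device):

```python
onboard_fifo = 8192   # samples -- hypothetical device FIFO depth
rate = 1000           # S/s, the original sample clock rate

# The output pin lags behind freshly written samples by one full FIFO:
latency_s = onboard_fifo / rate
print(latency_s)      # roughly 8.2 s of lag at 1 kS/s

# A faster sample clock burns through the same 8192 samples sooner,
# which matches the observation that raising the rate shrinks the delay:
print(onboard_fifo / 10000)   # roughly 0.8 s at 10 kS/s
```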

 

There are ways to control this behavior, but it's not the same for all devices.  Here are 2 possibilities; odds are they won't both work:

1. use a DAQmx Buffer property node to set the onboard buffer size to something smaller

2. use a DAQmx Channel property node to set the "Data Transfer Request Condition" to "Onboard Memory Empty".  The default is "Onboard Memory Full" which is robust against running out of data but also maximizes latency.
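For anyone who wants the text-API equivalent of those two property nodes, the Python nidaqmx package exposes roughly the following (an untested configuration sketch, not runnable without hardware; "Dev1/ao0" and the buffer size are placeholders, and the property names should be checked against your driver version):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, OutputDataTransferCondition

task = nidaqmx.Task()
ch = task.ao_channels.add_ao_voltage_chan("Dev1/ao0")   # placeholder device name
task.timing.cfg_samp_clk_timing(1000, sample_mode=AcquisitionType.CONTINUOUS)

# Option 1: shrink the onboard buffer (the DAQmx Buffer property node)
task.out_stream.output_onbrd_buf_size = 256             # samples; device-dependent

# Option 2: transfer only when the onboard FIFO drains
# (the DAQmx Channel property node's Data Transfer Request Condition)
ch.ao_data_xfer_req_cond = OutputDataTransferCondition.ON_BOARD_MEMORY_EMPTY
```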

 

If you can describe your system and what you want your app to accomplish, we can probably give better advice about how to get there from here.

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 4 of 4