07-29-2022 10:27 AM
Hello LabVIEW Community,
I've been dealing with this issue for the last few days and have yet to find a solution on the message boards or through my own tinkering. I have attached a slimmed-down version of my code, which reads an optical encoder and outputs an analog voltage at the same time using DAQmx Read and Write. I am using a While loop to ensure everything is measured at the same instant, and the Sample Clock used to trigger the Write command is set to Continuous Samples. I would like to get a measurement frequency in the range of 1 kHz, so I am writing my output signal (a sine wave) using 1 Channel - 1 Sample per iteration of the loop.
However, when I enable my voltage output, there is a significant delay (~10 seconds) before I start seeing the signal on my external oscilloscope, even though I have no delay programmed anywhere. I also find that if I increase the Sample Clock rate, the delay is reduced but does not disappear entirely. If I increase the rate too much, the output signal is very messy and has several jumps in it.
I've attached my VI, please let me know if there is anything else I can include to help address the problem.
07-29-2022 01:01 PM - edited 07-29-2022 01:07 PM
Hi Mike,
@Mike_H_NCSU wrote:
I would like to get a measurement frequency in the range of 1kHz, and to do so, I am write my output signal (a sine wave) using 1 Channel - 1 Sample per iteration of the loop.
For sample rates >= 100 Hz you should NOT use any "1 Sample" mode of DAQmx Read!
What are you really trying to implement? What is the requirement for reading AND writing 1 sample at a time at 1 kS/s (or even higher)?
07-29-2022 01:11 PM
Remember the Rules of Data Flow (on which LabVIEW is based). In particular, a Structure (such as a While Loop) or Function cannot exit (or, in the case of the While Loop, start the next iteration) until everything inside it has run.
You are acquiring Analog data at some frequency (possibly 1 kHz -- it is difficult to tell where the "1000" is wired on your diagram). Inside the While loop, you generate a single output (taking 1 ms) and do lots of other things (updating a Graph, reading a single input, generating a function value), and you do all of this for every single point.
I'm 90% certain that the problem is in the DAC section of the code. Here's what I recommend that you try:
Bob Schor
07-29-2022 03:32 PM
What's your DAQ device for analog output? A delay on the order of ~10 seconds is almost certainly due to your DAQ device's onboard buffer. I'm gonna venture that you might have a device with an 8192-sample onboard buffer.
As your code is written, you'll iterate very very rapidly at first, writing 1 sample at a time into the task buffer. Meanwhile, behind the scenes DAQmx is transferring those samples down to the device's onboard buffer until it fills up. Once it does, you'll have a lag > 8 seconds between writing new samples to the task buffer and when the signal shows up at the physical output pin. When you set the sample rate higher than 1000 Hz, you're still delayed by 8192 samples, but it no longer takes 8 seconds to burn through them.
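The arithmetic here is easy to sanity-check on its own. A minimal sketch (plain Python, no hardware needed), assuming the hypothetical 8192-sample onboard FIFO guessed above:

```python
def output_latency_s(onboard_buffer_samples: int, sample_rate_hz: float) -> float:
    """Worst-case lag between writing a sample to the task buffer and it
    appearing at the physical pin, once the onboard FIFO has filled:
    the device must first play out everything already queued ahead of it."""
    return onboard_buffer_samples / sample_rate_hz

print(output_latency_s(8192, 1000))   # 8.192 -> the "> 8 seconds" lag at 1 kS/s
print(output_latency_s(8192, 10000))  # 0.8192 -> same 8192-sample lag, burned through 10x faster
```

This also matches the symptom in the original post: raising the sample rate shrinks the delay (same number of queued samples, consumed faster) but can never eliminate it while the FIFO stays full.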
There are ways to control this behavior, but it's not the same for all devices. Here are 2 possibilities; odds are they won't both work:
1. use a DAQmx Buffer property node to set the onboard buffer size to something smaller
2. use a DAQmx Channel property node to set the "Data Transfer Request Condition" to "Onboard Memory Empty". The default is "Onboard Memory Full" which is robust against running out of data but also maximizes latency.
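For anyone working outside LabVIEW, here is roughly what those two property-node tweaks look like in NI's `nidaqmx` Python wrapper (a sketch only: `Dev1/ao0`, the buffer size of 32, and the exact property/enum names are assumptions to verify against your installed driver and device, since not all hardware supports either setting):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, OutputDataTransferCondition

with nidaqmx.Task() as task:
    # Device/channel name assumed for illustration.
    ao_chan = task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    task.timing.cfg_samp_clk_timing(1000, sample_mode=AcquisitionType.CONTINUOUS)

    # Possibility 1 (Buffer property node equivalent): shrink the onboard
    # FIFO so fewer samples sit between the task buffer and the pin.
    task.out_stream.output_onbrd_buf_size = 32

    # Possibility 2 (Channel property node equivalent): only transfer data
    # down to the device when its onboard memory is empty. The default,
    # "Onboard Memory Full", keeps the FIFO topped up and maximizes latency.
    ao_chan.ao_data_xfer_req_cond = OutputDataTransferCondition.ON_BOARD_MEMORY_EMPTY
```

Try them one at a time: on devices that don't support a given setting, the property write will raise a DAQmx error rather than silently doing nothing.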
If you can describe your system and what you want your app to accomplish, we can probably give better advice about how to get there from here.
-Kevin P