
LabVIEW


Unwanted delay when using DAQ in LabVIEW

Hello again NI. I have a problem.

In LabVIEW, I have constructed the "circuit" shown in the picture (file below).

The signal generator feeds into a DAQ board and back into a graph in LabVIEW.

However, if I adjust anything about the signal, such as the amplitude, frequency, or offset, there is a delay of about 5-10 seconds before I see any change on the graph.

 

This is very unwanted, so I would like to know: is there any way I can reduce the delay, or maybe remove it completely?

I hope someone can help me.

Message 1 of 18

You will have to attach your VI so we can see the settings in those Express VIs (DAQ Assistants and Simulate Signal). My guess is that at least one of them is taking the 5-10 seconds to execute, which keeps your loop from iterating and stalls the update.


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 2 of 18

My bad. Here is my VI.

Message 3 of 18

Hi Brosa,

 

as far as I can see, all Express VIs should execute within 100 ms.

 

Try this:

Move the DAQ Assistant that reads the input signal, together with its chart, into its own loop.

Add another chart right after the Simulate Signal Express VI to verify the waveform-creation part.

Run both loops in parallel.
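LabVIEW itself is graphical, so the diagram can't be quoted here, but the two-loop idea can be sketched as a rough Python analogy (everything below is a hypothetical stand-in: the sleeps mimic a fast waveform loop and a slow DAQ read):

```python
import threading
import time

# Iteration counters for the two independent loops.
updates = {"generate": 0, "read": 0}
stop = threading.Event()

def generate_loop():
    # Fast loop: create the waveform and update its own chart every ~100 ms.
    while not stop.is_set():
        updates["generate"] += 1
        time.sleep(0.1)

def read_loop():
    # Slow loop: stands in for a DAQ read that blocks ~1 s per iteration.
    while not stop.is_set():
        updates["read"] += 1
        time.sleep(1.0)

threads = [threading.Thread(target=generate_loop),
           threading.Thread(target=read_loop)]
for t in threads:
    t.start()
time.sleep(2.05)   # let both loops run for ~2 seconds
stop.set()
for t in threads:
    t.join()

# The fast loop iterated many more times than the slow one.
print(updates)
```

The point is that the slow read no longer gates the fast loop: each chart updates at its own loop's pace.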

 

How are AO0 and AI0 connected?

 

Btw., learning DAQmx is simple: read this and examine the example VIs that come with LabVIEW!

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 4 of 18

Hi there.

I am a bit unsure of what you wanted me to do with the loops. Here is how I understood it (see the attached file).

Have I done what you suggested?

Also, to answer your second question: AO0 and AI0 are physically connected on the data acquisition board through a BNC cable.

 

Message 5 of 18

Hi Brosa,

 

nearly complete: just add one more chart and connect it to the signal coming from the Simulate Signal Express VI…

(And use AutoCleanup.)

Best regards,
GerdW


Message 6 of 18

Okay, I've done what you suggested.

I can see that the chart (or graph) connected directly to the simulated signal changes immediately when I change the offset, frequency, etc.

However, the graph connected to the DAQ still has a big delay, and I really don't want that 😞

Is there anything else I can do?

Message 7 of 18

Hi Brosa,

 

you changed the DAQ Assistant that reads the AI signal from reading 100 samples to 1k samples: now it takes 1 s to acquire and display new data instead of just 100 ms…
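A quick sanity check of that arithmetic (assuming a 1 kHz sample rate, which is what these numbers imply; the actual rate is whatever is set in the DAQ Assistant's timing settings):

```python
# A "read N samples" call cannot return until all N samples have been
# acquired, so it blocks for at least N / sample_rate seconds per iteration.
def read_latency_s(samples_per_read, sample_rate_hz):
    return samples_per_read / sample_rate_hz

print(read_latency_s(100, 1000))   # 0.1  -> chart refreshes every 100 ms
print(read_latency_s(1000, 1000))  # 1.0  -> chart refreshes once per second
```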

Best regards,
GerdW


Message 8 of 18

Are you going directly from the analog out to the analog in? Is there something in between the analog out and analog in?

Tim
GHSP
Message 9 of 18

I also noticed that your signal output is sending 100 samples and your DAQ Assistant2 is sending 500 samples. You should make those the same.

Message 10 of 18