Real-Time Measurement and Control


Analogue Input-Output delay

Solved!

Hi

I am a new LabVIEW user, and I am trying to implement a controller in real time using LabVIEW. As a first step, I started doing analogue input/output exercises. As part of the learning process, I tried applying an input to a system, acquiring the data, and feeding it back out through an analogue channel. However, I noticed a significant delay of about 1 ms between input and output. I then tried the simplest possible exercise: I generated an input signal, read it into LabVIEW, and wrote it straight back out, so the task involves only the ADC and DAC. It still shows the same delay. I was under the impression that hardware-timed reads and writes would reduce the delay, but there was no change. Can anyone please help me out with this, or will there always be this much delay?

For reference, I am attaching the .vi file.
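In case the attachment does not open for anyone, here is a rough text sketch of the same loopback, written in Python with NI's nidaqmx package (the device name "Dev1" and channels ai0/ao0 are assumptions; the loop is software-timed, so each iteration includes driver and OS overhead on top of the converter time):

import statistics
import time

import nidaqmx

# Software-timed AI -> AO loopback: one on-demand ADC read and one
# on-demand DAC write per iteration. Device/channel names are illustrative.
with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")

    times = []
    for _ in range(1000):
        t0 = time.perf_counter()
        ao.write(ai.read())          # read one sample, write it straight back
        times.append(time.perf_counter() - t0)

    print(f"median loop time: {statistics.median(times) * 1e6:.0f} us, "
          f"worst: {max(times) * 1e6:.0f} us")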

 

Any kind of help would really be appreciated.

Message 1 of 5

This is a more generic DAQmx question; you might get a faster and better answer by posting it in the Multifunction DAQ forum: https://forums.ni.com/t5/Multifunction-DAQ/bd-p/250

 

But if I had to create a real-time input/output path with minimum delay, I would program it in the FPGA. Even then I would expect some delay, because it takes multiple clock cycles to read an analogue input and multiple more to write an analogue output.
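As a rough illustration of that point (every number below is an assumption for the sake of arithmetic, not a figure from any datasheet), even a few dozen clock ticks per conversion leaves the FPGA loop far below 1 ms:

fpga_clock_hz = 40e6   # 40 MHz is the default clock on many NI FPGA targets
adc_ticks = 40         # assumed ticks to complete one ADC conversion (~1 us)
dac_ticks = 40         # assumed ticks to update the DAC (~1 us)
logic_ticks = 10       # assumed ticks for the loop logic itself

round_trip_s = (adc_ticks + dac_ticks + logic_ticks) / fpga_clock_hz
print(f"estimated round trip: {round_trip_s * 1e6:.2f} us")  # ~2.25 us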

 

I don't know which FPGA or DAQmx setup would give the shortest delay. There might be DAQmx driver options that are faster, but it would take someone with more DAQmx experience to answer that question.

Regards,
André (CLA, CLED)
Message 2 of 5

Thanks, André, for your suggestions.

 

Don't you think a 1 ms delay is too much? I was expecting the delay to be in microseconds, or at least well under 1 ms.

 

 

Message 3 of 5
Solution
Accepted by topic author zayed001

DAQmx is aimed more at data acquisition: reading in data, processing, storage, and display. It is good at handling large, fast sample streams, but it abstracts away the detail of what is running on the hardware to make it easy to program.

You can do real-time feedback control in DAQmx, i.e. calculate a control action from the measurement (AI) and write it to the analogue output (AO), but the fact that you are not working directly with the low-level code on the hardware means there will be limitations.
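For concreteness, a minimal sketch of that pattern in Python with NI's nidaqmx package (the device name "Dev1", the setpoint, and the gain are illustrative assumptions; the loop is software-timed, so its period is at the mercy of the OS scheduler):

import nidaqmx

SETPOINT_V = 1.0   # assumed target value, volts
KP = 0.5           # assumed proportional gain

# Point-by-point feedback: read AI, compute a control action, write AO.
with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")

    while True:
        error = SETPOINT_V - ai.read()                # current AI measurement
        action = max(-10.0, min(10.0, KP * error))    # clamp to a +/-10 V AO range
        ao.write(action)                              # control action out on AO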

 

LabVIEW RT (Real-Time) is more appropriate, as it works at a lower level, typically using point-by-point functions for real-time control. However, if you use it in Scan Interface mode (the default), the fastest sample time it will give, and hence the shortest delay between AI and AO, is 1 ms. Remember that this code still runs on an operating system; even though it is real-time, there is computational overhead.

 

For anything faster you will need LabVIEW FPGA, as this allows you to run code directly on an FPGA (silicon, no operating system), which gets you down to the fundamental speed of the ADC and DAC.

 

In addition to which flavour of LabVIEW you develop in, the timing also depends on your hardware:

  • whether you are using a cDAQ, a cRIO, or something else, as that will dictate which LabVIEW flavour you use
  • the fundamental sampling behaviour of the ADC and DAC modules connected and their internal technology (delta-sigma or SAR; see the sketch after this list)
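On that last point: a delta-sigma converter's digital filter alone adds a group delay of tens of samples before any software is involved, whereas a SAR converter responds within roughly one conversion. A back-of-the-envelope illustration (both figures below are assumed, not taken from any particular module):

sample_rate_hz = 50e3       # assumed module sample rate
filter_delay_samples = 40   # assumed group delay of the delta-sigma filter

input_delay_s = filter_delay_samples / sample_rate_hz
print(f"delta-sigma input group delay: {input_delay_s * 1e3:.1f} ms")  # 0.8 ms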

Hope this helps.

Consultant Control Engineer
www-isc-ltd.com
Message 4 of 5

Thanks. I understand the reason now.

 

 

Message 5 of 5