04-19-2011 08:59 AM
Dear forumers,
I am pretty new to LabVIEW programming, so excuse me if this was posted before.
I use: LabVIEW 8.5 + PCI-6229
I want to create a real-time feedback loop: control an analog output based on an analog input. The task is very simple: if the ai1 (analog input 1) voltage is larger than a constant threshold, then decrease the voltage of ao1 (analog output 1) by a constant step. Otherwise, increase the voltage of ao1 by the same step.
All should work for 10 seconds.
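The decision rule described above can be written out as a tiny sketch (in Python rather than LabVIEW, purely to pin down the logic; the function name and arguments are illustrative, not anything from the attached VIs):

```python
def next_output(ai1_voltage, ao1_voltage, threshold, step):
    """One iteration of the feedback rule: step the output down
    when the input exceeds the threshold, otherwise step it up."""
    if ai1_voltage > threshold:
        return ao1_voltage - step
    return ao1_voltage + step
```

In LabVIEW terms this is just a comparison feeding a Select node that adds or subtracts the step constant.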
I have tried to implement this using two DAQ Assistants and various methods. The problem is that the whole 10-second signal gets recorded from ai1 first, and only then is it modified and written to the output (taking another 10 seconds). The execution takes 20 seconds instead of 10, and it is not real-time at all.
Do I have to run the VI in "Run Continuously" mode to make it work?
Please help :}.
I am attaching 2 examples.
1. Main idea what I am trying to accomplish
2. A very simple example that still does not read and write simultaneously in real time.
04-19-2011 09:19 AM
It's simple arithmetic. The number of samples divided by the sample rate is how long an acquisition will take. That is where the 10 seconds comes from. Reduce the number of samples read at once (e.g., to 10). Then put everything in a loop that terminates after 10 seconds. The Elapsed Time function is useful for that.
Do not use the Run Continuously button; it is only meant for special debugging situations.
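The arithmetic in the reply above is worth making concrete (a plain Python illustration; the 50 Hz rate comes from the original post, and 500 samples per read is an assumed value that would produce the observed 10-second block):

```python
rate_hz = 50

# Reading a big block at once: the read call blocks until all
# samples have been acquired, so duration = samples / rate.
samples_per_read = 500
print(samples_per_read / rate_hz)  # 10.0 seconds per read call

# Reading a small block instead keeps each iteration short,
# so the loop can react between reads.
samples_per_read = 10
print(samples_per_read / rate_hz)  # 0.2 seconds per read call
```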
04-19-2011 09:45 AM
Thank you for your suggestion.
I understand the simple arithmetic you are talking about. However, I do not understand why it takes 20 seconds instead of 10 to complete the task. In other words: why does it read the whole input first (10 seconds) and only then write the output (another 10 seconds)?
Why doesn't it read and write simultaneously?
If I use the Elapsed Time function, what feedback response time can I expect? Is it good to use Elapsed Time if I need a fast response time? In my application I would need about 1 millisecond response time. Currently the rate in the example is 50 Hz; later I would like to change the rate to 1000 Hz.
04-19-2011 10:08 AM
I suggest you dump the DAQ Assistant crutch, open the example called "Multi-Function_Synch AI-AO.vi" and learn how to use the DAQmx primitives. I don't really know if the DAQ Assistant is capable of doing what you're trying to do but learning to actually code it yourself with DAQmx is a very useful skill. You will outgrow the Express VIs and Assistants eventually and DAQmx is pretty amazing once you get past a short learning curve.
Plus, many of us don't use the Assistant and if you use real LabVIEW we can help you better. Give it a look and post back with any questions...
04-19-2011 01:26 PM
@nooto wrote:
Thank you for your suggestion.
I understand the simple arithmetic you are talking about. However, I do not understand why it takes 20 seconds instead of 10 to complete the task. In other words: why does it read the whole input first (10 seconds) and only then write the output (another 10 seconds)?
Why doesn't it read and write simultaneously?
If I use the Elapsed Time function, what feedback response time can I expect? Is it good to use Elapsed Time if I need a fast response time? In my application I would need about 1 millisecond response time. Currently the rate in the example is 50 Hz; later I would like to change the rate to 1000 Hz.
You simply don't understand the underlying paradigm of LabVIEW - dataflow. You have a dependency between the first DAQ Assistant and the second. The second simply will not start until the first finishes. There is no way around that given your condition that the output depends on what you read.
If you want 1 millisecond response time then you will have to switch to a real-time OS. Windows does not have that kind of timing resolution, and there is way too much jitter.
04-19-2011 04:06 PM
Dear NIquist and Denis,
Thank you for the explanations.
Can you let me know what response time I can expect to achieve with LabVIEW and the PCI-6299 for the kind of application I am trying to accomplish? Just approximately.
04-20-2011 03:04 AM
Sorry, I mistyped the model number: it is PCI-6229, not PCI-6299.
As for "Multi-Function_Synch AI-AO.vi": it is a nice example, but there is no data relation between the AI and AO, only timing synchronization. I am not really sure, but the case might have been solved in this post:
However, I cannot open the attached "dagmar - ao based on ai 10.vi" (31 KB); it is saved in LabVIEW 8.6, while I use 8.5.
Can anyone help me convert this? :}
04-20-2011 08:44 AM
Your 6229 DAQ can read samples at 250 kS/s (4 µs/sample) and write at 833 kS/s (~1.2 µs/sample). But since you want to analyze the input and change the output based on that analysis, you are now limited by Windows' (WinDOZE) slow and non-deterministic response to real-time signals. 1 millisecond is about all you can hope for. It will help to shut down other processes and set your VI to the highest thread priority, but to do what you want to do you really need a real-time OS.
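For reference, the per-sample periods follow directly from the PCI-6229 datasheet rates (250 kS/s analog input, 833 kS/s analog output), since period = 1 / rate:

```python
ai_rate = 250e3   # 6229 analog input, samples/second
ao_rate = 833e3   # 6229 analog output, samples/second

print(1 / ai_rate)  # 4e-06 s = 4 µs per input sample
print(1 / ao_rate)  # ~1.2e-06 s per output sample
```

The hardware is therefore thousands of times faster than the ~1 ms software round trip Windows can deliver; the OS, not the card, is the bottleneck here.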
04-20-2011 08:51 AM
P.S. The code in your link won't help, but if you want it converted, post it here: http://forums.ni.com/t5/LabVIEW/Downconvert-VI-Requests/m-p/1067229
I have 8.5 and 2010, but not 8.6.