I'm trying to get a LabVIEW program to output an array of data at precise time intervals, but I'm not sure how to do this. The user enters parameters that are used to generate the required array. I then need the array values output at a fixed rate (for example, 10 values per second). I was thinking about using a loop, but I worry that as the program becomes more complex, an iteration, even in a timed loop, may take longer than the output period, so the values would not go out at the required times. When the DAQ reads voltages, it samples according to an internal clock. Is there a way to hand an array of values to the card and have it output them paced by that internal clock?
I was also wondering how the DAQ output works. Does it just read a value and output it immediately? If so, what is the buffer for?
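Roughly, the buffer is what decouples your software from the card's clock: the PC dumps a whole block of samples into the buffer in one burst, and the card's hardware clock drains it one sample at a time at the configured rate, regardless of how busy your loops are. A toy pure-Python sketch of that idea (names and values invented for illustration):

```python
from collections import deque

class BufferedAO:
    """Toy model of a buffered analog output: software fills the
    FIFO in bursts; the hardware clock drains one sample per tick."""
    def __init__(self):
        self.fifo = deque()
        self.emitted = []

    def software_write(self, samples):
        # The PC can dump a whole array at once, at any time.
        self.fifo.extend(samples)

    def clock_tick(self):
        # Called by the card's internal clock at an exact rate,
        # independent of how long the PC's loop iterations take.
        if self.fifo:
            self.emitted.append(self.fifo.popleft())

ao = BufferedAO()
ao.software_write([0.0, 1.0, 2.0, 3.0])   # one burst from software
for _ in range(4):
    ao.clock_tick()                        # paced by hardware
# ao.emitted is now [0.0, 1.0, 2.0, 3.0], in order, one per tick
```

The point of the model: `software_write` can happen whenever your loop gets around to it, while `clock_tick` runs on a precise schedule, so the output timing never depends on loop timing as long as the buffer doesn't run dry.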
If you have a card that supports hardware-timed writes, you set the sample rate in basically the same manner as for a hardware-timed read. If you create a waveform datatype, you can also choose to use its timing information. Just look at the examples.
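For reference, the same configuration looks roughly like this in NI's Python API for the DAQmx driver (`nidaqmx`). The channel name `Dev1/ao0` and the 10 S/s rate are assumptions, and the DAQmx calls of course require the driver and hardware to be present, so they're kept in a separate function here:

```python
import math

def make_sine(rate_hz, freq_hz, n_samples, amplitude=1.0):
    """Precompute one buffer of samples. The card's sample clock,
    not software, paces how fast these leave the device."""
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / rate_hz)
            for i in range(n_samples)]

def write_hardware_timed(channel, data, rate_hz):
    # Requires NI-DAQmx and real hardware; channel name is an assumption.
    import nidaqmx
    from nidaqmx.constants import AcquisitionType
    with nidaqmx.Task() as task:
        task.ao_channels.add_ao_voltage_chan(channel)
        task.timing.cfg_samp_clk_timing(
            rate=rate_hz,
            sample_mode=AcquisitionType.FINITE,
            samps_per_chan=len(data))
        task.write(data, auto_start=True)   # whole array handed over at once
        task.wait_until_done()

if __name__ == "__main__":
    buf = make_sine(rate_hz=10.0, freq_hz=1.0, n_samples=20)
    # write_hardware_timed("Dev1/ao0", buf, 10.0)  # uncomment with hardware
```

Note that the entire array is written before the task starts; the card then outputs one value per clock edge on its own, which answers the "loop or all at once" question below.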
Sorry, which examples are you referring to that explain how to write an array output at a given rate? Does this require a loop of some kind, or does the DAQ read the whole array at once and then output a voltage from it at each time interval?
Additionally, I have a question about using the analog output in conjunction with the analog input. I have a system that generates a signal depending on a signal that is fed into it.
My program has a user interface with a scroll bar to select the voltage output to the system, and the program simultaneously reads in the system's output. I learned recently that the DAQ read function is timed very well using crystal oscillators, so I have no concerns about the data I am getting. However, I want the output to the system to run at the same rate as the output FROM the system is being read. I use a loop to constantly refresh the value that should be output to the system from the scroll bar. My fear is that, because the loop does not iterate as precisely as the DAQ read rate, I will get mismatched values. How should I remedy this?
Additionally, it would be great if I could read back the voltage being output by the DAQ as it is outputting it, to double-check that the value given to the DAQ to write is actually the value being written out. To do this, I was thinking about feeding it into an analog input; however, I am not sure whether this routing of signals would cause any timing offset.
I assume you are using an NI DAQ card. With these it is possible to sync AO and AI at the hardware level. Whenever you need to compute the next cycle's output from the input, you pick up time delay and jitter from your OS (at least 1 to 2 ms on Windows; you can improve the response time, but without any guarantee).
Because your eye is a bit slower, you won't notice this when using a scroll bar...
Post code if possible.
Here's an example that should at least get you started in the right direction with synchronizing AI and AO:
You probably don't need or want the External Start Trigger (in the example you can take out the DAQmx Trigger VI in the AI task). I'd also suggest configuring the "Delay from Sample Clock" Timing Property on your AI task (assuming you're on a board that supports it) to give a couple of microseconds for the AO to settle.
You don't have to physically wire the AO channel back in, as the DAQmx driver lets you read from internal channels on most boards (e.g. "Dev1/_ao0_vs_aognd").
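In the nidaqmx Python API the internal loopback is just another AI channel name. The sketch below builds that name and shows where the "Delay from Sample Clock" timing property lives; `Dev1`, the rates, and the 2 µs delay are assumptions, and the actual acquisition requires the driver and hardware, so it's isolated in its own function:

```python
def internal_ao_channel(device, ao_index):
    """Name of the internal loopback channel for an AO line
    (supported on most, but not all, boards)."""
    return f"{device}/_ao{ao_index}_vs_aognd"

def read_ao_loopback(device="Dev1", ao_index=0, n=100, rate=1000.0):
    # Requires NI-DAQmx and real hardware; names above are assumptions.
    import nidaqmx
    from nidaqmx.constants import AcquisitionType
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan(
            internal_ao_channel(device, ao_index))
        task.timing.cfg_samp_clk_timing(
            rate=rate,
            sample_mode=AcquisitionType.FINITE,
            samps_per_chan=n)
        # Delay the AI convert relative to the sample clock so the AO
        # has a moment to settle (units are set via the matching
        # delay_from_samp_clk_delay_units property):
        task.timing.delay_from_samp_clk_delay = 2e-6
        return task.read(number_of_samples_per_channel=n)

print(internal_ao_channel("Dev1", 0))  # Dev1/_ao0_vs_aognd
```

Reading the internal channel avoids any external wiring and measures the AO right at the card, so you sidestep the routing-induced timing offset the original poster was worried about.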
And the examples I was referring to can be found by going to Help>Find Examples. To be more specific, look under Hardware Input and Output>DAQmx>Analog Generation>Voltage.
This whole discussion is moot, however, if your DAQ device does not support hardware-timed analog generation. You should check your device's specs to know whether it does. Since you have not provided the model number, it's impossible for anyone here to know.