
LabVIEW


I am making a real-time program with an AT-MIO-16E-10 DAQ board. I need continuous scanning every 1.66 or 2 ms (only 1 channel). I am using "AI Sample Channel" inside a For Loop.
Sampling is controlled by a "Wait Until Next ms Multiple", but it only has 1 ms accuracy.
How can I solve that?
Is there a higher-accuracy clock?
I do not use buffered scanning because I cannot lose any sample.
My acquisition board does not support multi-buffered scanning.
LabVIEW version 5.01
DAQ board: AT-MIO-16E-10
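
To illustrate the resolution problem, here is a small sketch (plain C, not LabVIEW, and no DAQ calls) under the assumption that the software wait can only stop on whole-millisecond boundaries: a 1.66 ms period never lines up with 1 ms ticks, so every wakeup is early or late by up to a millisecond.

/* Minimal sketch: why a 1 ms resolution wait cannot pace a 1.66 ms loop.
 * Each wakeup is forced onto a whole-millisecond boundary, so the loop
 * jitters around the ideal sample times. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double period_ms = 1.66;   /* desired sample period          */
    double ideal_ms = 0.0;           /* where the sample should fall   */

    for (int i = 0; i < 10; i++) {
        ideal_ms += period_ms;
        /* A millisecond-granular wait can only stop on whole ms ticks. */
        double actual_ms = ceil(ideal_ms);
        printf("sample %2d  ideal %6.2f ms  actual %6.2f ms  error %+5.2f ms\n",
               i + 1, ideal_ms, actual_ms, actual_ms - ideal_ms);
    }
    return 0;
}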
Message 1 of 3
Using high-level software to time the interval between scans at these kinds of frequencies is useless.

I always thought that the double-buffer mechanism was strictly an NI-DAQ (driver) feature, not something your board has to support with hardware buffers.
It could be that your old board is no longer supported by a recent NI-DAQ version, but an older version should not be a problem for your application. Just try double-buffering and regularly read in the new values (I don't remember the details; it has been a while since I last used it).
Message 2 of 3
There is a very well-documented KnowledgeBase article that answers your question. The trick is to use AI Single Scan. The article can be found at the following link:
http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/9c484052b5409bc4862569d5005b9c32?OpenDocument
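
Roughly, the hardware-timed single-point pattern from that article looks like the plain-C sketch below. The calls configure_scan_clock, start_scan, and read_single_scan are hypothetical placeholders (standing in for the AI Config / AI Start / AI Single Scan VIs; they are not real NI-DAQ functions): the point is that the board's scan clock paces the loop, so the 1 ms software timer is no longer involved.

#include <stdio.h>
#include <math.h>

/* Hypothetical stand-ins for the driver calls -- they only simulate the
 * behaviour here so the sketch compiles and runs.  In LabVIEW these roles
 * are played by AI Config, AI Start and AI Single Scan. */
static double g_rate_hz = 0.0;
static int    g_tick    = 0;

static void configure_scan_clock(double rate_hz) { g_rate_hz = rate_hz; }  /* hardware scan clock */
static void start_scan(void)                     { g_tick = 0; }           /* arm the acquisition */
static double read_single_scan(void)
{
    /* The real call would block until the board's scan clock ticks and
     * then return exactly one hardware-timed sample. */
    return sin(2.0 * 3.14159265 * g_tick++ / g_rate_hz);
}

int main(void)
{
    configure_scan_clock(602.4);          /* ~1.66 ms per scan, set in hardware */
    start_scan();

    for (int i = 0; i < 10; i++) {
        /* One sample per hardware clock tick; no software wait, no lost samples. */
        double sample = read_single_scan();
        printf("scan %d: %f\n", i, sample);
    }
    return 0;
}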
Message 3 of 3