As the documentation says, setting the input limits properly can improve measurement precision, but the input limits always have to be configured (by calling DAQmxCreateAIVoltageChan) before the test starts.
However, the real signal to be measured often varies over a wide range, for example from 5 mV to 5 V. With the input limit fixed at 5 V, the quantization step of the ADC is spread over the whole 5 V range, so the relative digitization error when measuring a 5 mV signal is much larger than when measuring 5 V. If the input limit could be adjusted to 10 mV or 5 mV on the fly, the measurement precision could be greatly improved. Is there any method or algorithm for this?
^_^^_^