I have a simple analog input task configured to acquire a voltage level on a 16-bit M series board at 50 kHz in NRSE mode. The device being measured outputs a voltage from 0 to 1.5 volts. Because I have a number of parallel tasks and want optimum throughput and minimum memory usage, I use the raw version of AI Read (I16 output) rather than AI Read (volts), which outputs double-precision floating-point values. Later on (offline), I'd like to convert these raw I16 ADC values to the appropriate voltages for further processing, which requires knowing the gain of the ADC.
When I configure the AI task, I leave the min/max range inputs unwired, so they default to +/- 5 volts. I assume this means the ADC gain is set so that +/- 5 volts is the maximum digitizable level. Therefore, I figured that dividing the raw I16 values by 6553.4 should give me the voltage I'm after, since the I16 range is [-2^15, 2^15-1] and thus:
Voltage = Raw value x 5 / (2^15 - 1) volts/bit.
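To make the offline conversion concrete, here is the scaling I am currently applying, written out as a small Python sketch (raw_to_volts_assumed is just my own illustration of the arithmetic, not an NI function):

```python
def raw_to_volts_assumed(raw_count, full_scale_volts=5.0):
    """My assumed conversion: the I16 range [-2**15, 2**15 - 1]
    maps linearly onto +/- full_scale_volts."""
    lsb_volts = full_scale_volts / (2**15 - 1)   # ~152.6 uV per code
    return raw_count * lsb_volts

# Dividing by 6553.4 is the same thing:
print(raw_to_volts_assumed(7760))   # ~1.184 V
print(7760 / 6553.4)                # ~1.184 V
```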
However, I seem to be getting inaccurate results. For example, when I set the device to output 1.26 volts (confirmed independently with a multimeter), the raw value from AI Read is approximately 7760, which corresponds to 1.18 volts by the equation above. If I instead use AI Read (volts), or read the voltage in MAX, I get the expected 1.26 volts. This leads me to think my assumption about the ADC gain is incorrect.
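Just to quantify the discrepancy, backing the scale factor out of the numbers above (this uses only the figures already quoted; the ~5.3 V full scale it implies is simply what the arithmetic suggests, not something I've verified):

```python
measured_volts = 1.26   # confirmed by multimeter and AI Read (volts)
raw_count = 7760        # raw I16 value returned by AI Read

assumed_lsb = 5.0 / (2**15 - 1)             # ~152.6 uV/code (my assumption)
effective_lsb = measured_volts / raw_count  # ~162.4 uV/code (what the data imply)

print(assumed_lsb, effective_lsb)
print("implied full scale:", effective_lsb * (2**15 - 1))  # ~5.32 V, not 5 V
```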
Is there a way to determine the appropriate formula for converting raw AI data to voltage? Is my assumption about the gain incorrect?