08-26-2008 07:19 PM
I have a simple FPGA VI that reads a single analog input from within a while loop. The relevant analog input is set to the correct terminal mode (RSE), and the voltage range is set dynamically within the loop. When I apply a variable voltage from a power supply to the analog input and read the result in a VI run from Windows (or the RT target, for that matter), it is consistently above the correct value, as verified with a multimeter. The measurement error depends on the voltage range and gets worse with larger input voltages. For instance, on the 10 V range, the module reads high by ~130 mV for inputs around 2 V, and by ~500 mV at around 10 V.
Clearly something is amiss with the calibration of the analog input module. I mimicked the code in "C:\Program Files\National Instruments\LabVIEW 8.5\examples\CompactRIO\Module Specific\NI 9205\NI 9205 Basic IO" to apply the module's calibration data to my readings, and got a bunch of nonsense. I then removed all of the adjustments except those involving the linearization coefficients, because I read somewhere that the fixed-point data type already handles the calibration on the FPGA, minus the linearization terms. The result was shifted by about 200 mV in the wrong direction.
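In pseudocode, the correction I am trying to mimic looks roughly like the sketch below (Python used only for illustration, since the actual VI is graphical; the function and parameter names are placeholders, not the real calibration-data fields from the example). If the fixed-point output really does handle the gain and offset on the FPGA, then only the second step should be applied on the host:

```python
# Hypothetical sketch of the two-step calibration math, assuming the module's
# calibration data supplies a per-range gain/offset pair plus polynomial
# linearization coefficients c0..cn. All names here are illustrative.

def calibrate(raw_code, lsb_weight, offset, lin_coeffs):
    """Convert a raw binary reading to volts.

    raw_code:   signed integer code from the module
    lsb_weight: volts per code for the selected range (from calibration data)
    offset:     offset in volts for the selected range (from calibration data)
    lin_coeffs: polynomial coefficients [c0, c1, c2, ...] applied to the
                gain/offset-corrected voltage
    """
    # Step 1: gain/offset correction (supposedly already done by the
    # calibrated fixed-point output on the FPGA)
    v = raw_code * lsb_weight - offset
    # Step 2: polynomial linearization (the small residual correction)
    return sum(c * v**i for i, c in enumerate(lin_coeffs))

# Example: a reading that is nominally 2.000 V before linearization
print(calibrate(raw_code=13107, lsb_weight=2.0 / 13107, offset=0.0,
                lin_coeffs=[0.0, 1.0, -1e-4]))  # ~1.9996 V
```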
I am running LabVIEW 8.5 with NI-RIO 2.4.1. Help would be greatly appreciated; thank you in advance!
08-27-2008 01:22 PM
Hi,
Those linearization coefficients make only a very small correction to the reading. I'd suggest you first try to get accurate results without using them. Your problem might be a grounding issue, as in this discussion:
http://forums.ni.com/ni/board/message?board.id=170&message.id=351379#M351379
Because of the way the 9205 works (input channels multiplexed into a single ADC), you want to make sure the channels you are not using are grounded; a floating channel can couple into adjacent readings.
Hope that explains the error.