Hi All,
I'm writing a program to measure voltage samples. I have 13 analog input channels, each carrying a signal between 0.0 V and 7.3 V.
When the input signal is about 7.3 V, my program reads 6.8 V.
When the input signal is about 0.0 V, it reads 0.55 V (much too high!).
With the same signals, I have run the test panels, and they show 7.42 V and 0.0002 V.
I don't understand why the test panels get such a different measurement!
My program uses continuous sampling at 2000 samples per second, and the measurement is made in RSE mode.
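In case it helps, here is a rough sketch of the configuration I describe, written with the NI-DAQmx Python API just for illustration (my actual program is different; the device name Dev1, the 0-10 V range, and the read size are placeholders, not my real settings):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType, TerminalConfiguration

    # Hypothetical sketch: 13 RSE channels, continuous sampling at 2000 S/s.
    # "Dev1" and the 0-10 V input range are placeholders, not my real setup.
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan(
            "Dev1/ai0:12",                              # 13 channels: ai0..ai12
            terminal_config=TerminalConfiguration.RSE,  # referenced single-ended
            min_val=0.0,
            max_val=10.0,
        )
        task.timing.cfg_samp_clk_timing(
            rate=2000.0,                                # 2000 samples per second
            sample_mode=AcquisitionType.CONTINUOUS,
        )
        task.start()
        data = task.read(number_of_samples_per_channel=200)  # one block per channel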
It's obvious I'm missing something, but I can't figure out what.
Thank you all,
Alvaro Romero.