Multifunction DAQ


Difference between Test Panels and program

Hi All,
I'm writing a program to measure voltage samples. I have 13 analog input channels, each carrying a signal between 0.0 V and 7.3 V.
When the input signal is about 7.3 V, my program reads 6.8 V.
When the input signal is about 0.0 V, it reads 0.55 V (very high!).
With the same signal, I ran the test panels, and the panel shows 7.42 V and 0.0002 V.
 
I don't understand why the test panels get a different measurement!
 
My program uses continuous sampling at 2000 samples per second. The measurement is done in RSE mode.
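
For reference, my acquisition is set up roughly like the sketch below (NI-DAQmx C API; the device name "Dev1", the channel range, and the buffer sizes are placeholders, not my exact code):

#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    float64    data[13 * 200];   /* 200 samples per channel per read */
    int32      read = 0;

    DAQmxCreateTask("", &task);
    /* 13 RSE channels; "Dev1" and the +/-10 V range are placeholders */
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0:12", "",
                             DAQmx_Val_RSE, -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    /* continuous sampling at 2000 samples/s per channel */
    DAQmxCfgSampClkTiming(task, "", 2000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 200);
    DAQmxStartTask(task);

    /* one read of 200 samples per channel (error checking omitted) */
    DAQmxReadAnalogF64(task, 200, 10.0, DAQmx_Val_GroupByChannel,
                       data, 13 * 200, &read, NULL);
    printf("first sample on ai0: %f V\n", data[0]);

    DAQmxStopTask(task);
    DAQmxClearTask(task);
    return 0;
}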
 
It's obvious I'm missing something, but I can't figure out what.
 
Thank you all,
Alvaro Romero.
Message 1 of 3

Hello,

I think I have the same problem.

http://forums.ni.com/ni/board/message?board.id=250&message.id=22427

Did you get any results?

Regards,
Lutz

Message 2 of 3
Hi Alvaro

Let me check that I understand this correctly:

When reading the values from MAX (test panels), the readings are correct, but when reading from your program, the values differ from what you expect. Is that right?

When you say 2000 samples/sec and RSE, is that configuration used both in the test panels and in your software? What software are you using (LabVIEW, LabWindows/CVI, ...)?

How are you inputting your signals? E.g., CH0 at 7.3 V and CH1 at 0 V, each read at 2000 samples/sec in the same task?
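
If it helps, one quick way to confirm what terminal configuration your task actually applied is a property query like the one below (a minimal sketch assuming the NI-DAQmx C API; the task handle and channel name are placeholders):

#include <stdio.h>
#include <NIDAQmx.h>

/* Print whether a channel in an existing, configured task is RSE. */
static void printTermCfg(TaskHandle task, const char *chan)
{
    int32 cfg = 0;
    DAQmxGetAITermCfg(task, chan, &cfg);
    printf("%s terminal configuration: %s\n", chan,
           cfg == DAQmx_Val_RSE ? "RSE" : "not RSE");
}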

 Regards

Javier Gutiérrez
Application Engineering
Message 3 of 3