I am using an NI 6225 (USB) DAQ to measure voltages from 8 thermometers on differential analog input channels. The expected signal from each thermometer is about 10 mV, so the range is set to -200 mV to +200 mV. The thermometers are RTDs with a negative temperature coefficient of resistance, with resistances as high as 12000 ohms. Since I need to make multichannel measurements from such high-impedance sources, I want to keep the interchannel delay as large as possible so that the ADC has sufficient settling time. The channel sampling rate I intend to use is 1000 Hz. The largest default interchannel delay on this device is 14 microseconds, but I am increasing it to 125 microsec (by setting the convert clock rate to sampling rate × no. of channels, so that the interchannel delay = 1/(sampling rate × no. of channels)) using a DAQmx Timing property node.
Now, before I actually sample the thermometers, I wanted to check whether I am getting the correct interchannel delays. So I shorted the 8 AI channels (connected each + pin to its - pin) and ran my VI. Timing-wise the DAQ is functioning as expected, but I am running into another problem. Even though every individual channel is shorted, the measured outputs go out of the selected range, i.e., the outputs saturate at 214 mV. To test further, I set the channel sampling rate to 2000 Hz; the interchannel delay dropped to 62.5 microsec (as expected), but the voltage reading is about 140 mV, which is incorrect. Going further with 5000 Hz sampling, the delay becomes 25 microsec as expected, but now the voltages read close to zero (correct!). The same thing happens when I sample at 10000 Hz: the delay becomes 12.5 microsec and the voltage reads near 0.
During my actual run I don't need to sample at more than 1000 Hz, which will give ample time (at least 125 microsec) for the ADC to settle even with the high-resistance sources. But I am currently running into the problem described above. Does running multichannel measurements with a large interchannel delay cause such a problem? Kindly help...
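For reference, the delay arithmetic described above (interchannel delay = 1/(sampling rate × no. of channels) for an 8-channel round-robin scan) can be checked with a quick Python snippet; the channel count and rates are just the values from this thread:

```python
# Interchannel (convert clock) delay for a multiplexed scan:
# delay = 1 / (sampling rate * number of channels)

N_CHANNELS = 8  # number of differential AI channels in this setup

def interchannel_delay_us(sample_rate_hz, n_channels=N_CHANNELS):
    """Interchannel delay in microseconds for a round-robin scan."""
    return 1e6 / (sample_rate_hz * n_channels)

for rate in (1000, 2000, 5000, 10000):
    print(f"{rate:>5} Hz -> {interchannel_delay_us(rate):.1f} us per channel")
# 1000 Hz gives 125.0 us, 2000 Hz gives 62.5 us,
# 5000 Hz gives 25.0 us, 10000 Hz gives 12.5 us
```

These match the delays reported above, so the timing configuration itself is behaving as expected.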
Where is this 214 mV signal showing up? Is it on one channel or all of them? It would also help if we could see your setup, so a screenshot of your code would be useful.
Attached are a picture of my DAQ and a screenshot of my LabVIEW program. The 214 mV signal appears on all eight channels. It goes away as soon as I connect the -ve pin of any active channel to ground. On disconnecting that -ve pin from ground, the signal slowly climbs back to 214 mV...
Here are some relevant parameters from the 6225 specifications:
Input impedance: >10 GOhm in parallel with 100 pF
Input bias current: ±100 pA
Max working voltage (signal + common mode): ±11 V of AI GND
Since the inputs have such high resistance, you will need to provide a return path to AI GND for the input bias current, or else charge will accumulate. Once the accumulated charge exceeds ±11 V relative to AI GND, you'll start running into accuracy issues as the input protection circuit kicks in.
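As a back-of-the-envelope check using only the spec values quoted above, and assuming the floating input charges mainly through the ~100 pF input capacitance (real hardware will have additional stray leakage paths, so this is an order-of-magnitude sketch, not a prediction):

```python
# Rough estimate of how fast a floating differential input drifts,
# using the NI 6225 spec values quoted in this thread.

I_BIAS = 100e-12   # input bias current, A  (spec: +/-100 pA)
C_IN = 100e-12     # input capacitance, F   (spec: 100 pF)
V_LIMIT = 11.0     # max working voltage relative to AI GND, V

drift_rate = I_BIAS / C_IN      # V/s at which the floating node charges
t_limit = V_LIMIT / drift_rate  # seconds to drift out of the working range

print(f"drift rate ~ {drift_rate:.1f} V/s")
print(f"time to reach +/-11 V ~ {t_limit:.0f} s")
# roughly 1 V/s, so on the order of 10 s to leave the working range
```

A drift on the order of seconds is consistent with the observation above that the 214 mV reading comes back "slowly" after the -ve pin is disconnected from ground.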
NI's Field Wiring and Noise Considerations for Analog Signals covers this to some degree; in particular, you should at least take a look at Table 1.
Why the convert clock rate affects the behavior is an interesting academic question that somebody more knowledgeable than myself would have to chime in on; however, it should be a non-issue once you resolve the wiring.
Perhaps that's why the voltage drops back to near zero when I connect the -ve pin of any AI channel to AI ground.