My current setup uses a cDAQ 9172 with three 9263 analog output modules, one 9211 thermocouple module, and one 9205 voltage input module. The 9263 modules and the 9211 seem to be working well, but I'm having some issues with the 9205. Its purpose is to report the output from two pressure transducers (differential mode), but the measurements are quite choppy. For example, one of the transducers outputs a steady voltage of 0.4963 V at atmospheric pressure, as confirmed by a multimeter hooked up right at the 9205 module. However, the LabVIEW reading bounces all over the place (+/- 2% of the true value, much worse than the achievable accuracy reported in the 9205 manual). This is visible in the top waveform chart of my screenshot. The second waveform chart plots the output from the second transducer. It's a little cleaner but still about +/- 1%. The third waveform is from a regulated power supply.
Is the 9205 picking up noise in the transducer signal that the digital multimeter can't register? This module has been damaged and repaired by NI twice; maybe it isn't operating properly? Furthermore, is it odd that all differential channels after the ones where a voltage is applied read roughly that value? For instance, 0.4963 V is applied across ACH0 - 8, and ACH1 - 9, ACH2 - 10 ... ACH5 - 13 all read roughly 0.5 V. The 10.43 V reading is applied across ACH6 - 14, and subsequent differential channels read this value until the 5 V signal is applied at ACH19 - 28.
Thanks for any help!
is it odd that all differential channels after the ones where a voltage is applied read roughly that value?
No, that is normal behaviour!
The NI 9205 uses a single ADC behind a multiplexer (MUX). When inputs are left floating, you essentially get back the reading from the channels scanned before them: this is called "ghosting".
Solution: only read the channels that actually have a signal source connected.
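To make the mechanism concrete, here is a toy simulation in pure Python (not the 9205's real circuitry, just the idea): a scanned ADC whose sample-and-hold stage keeps whatever charge the last driven channel left on it, so floating inputs echo earlier readings.

```python
# Toy model of multiplexer "ghosting" on a scanned ADC (illustrative only;
# not the NI 9205's actual hardware behaviour).

def scan(channels, driven):
    """Scan channels in order. `driven` maps channel index -> source voltage.
    Floating (unlisted) channels simply read back whatever charge is left
    on the sample-and-hold capacitor from the previous driven channel."""
    readings = {}
    hold = 0.0  # voltage currently sitting on the sample-and-hold capacitor
    for ch in channels:
        if ch in driven:
            hold = driven[ch]  # a real source (re)charges the capacitor
        readings[ch] = hold    # a floating input has nothing to change it
    return readings

# 0.4963 V applied on differential channel 0, 10.43 V on channel 6,
# everything else floating, matching the pattern described above
r = scan(range(8), {0: 0.4963, 6: 10.43})
print(r[1], r[5], r[7])  # floating channels echo the previous driven value
```

In practice this means restricting the DAQmx task's channel list to the wired channels (or tying unused inputs down) rather than scanning everything.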
Is the 9205 picking up noise in the transducer signal that the digital multimeter can't register?
Thanks for the info. I've made an interesting observation that I hope sheds some light on the issue. When I first turn on the DAQ chassis and run the VI, the signal from the 10 V transducer is quite stable. Then, after a few seconds, it begins to diverge and the oscillations slowly grow. This is observable in the attached screenshot on the lower waveform chart: at around the half-way point, the oscillations begin to increase. Note that this happens with no changes whatsoever to the physical surroundings. If I stop the VI and restart it without power-cycling the DAQ chassis, the plot picks up with the same messy signal.
The ADC and electronics in a multimeter sample over a longer time and average the result for the display.
The acquisition and conversion time in your cDAQ module is much shorter, so its readings are closer to the instantaneous signal.
I suspect that the fluctuations are real and are actually coming out of the sensor.
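As a rough illustration of why the DMM looks quieter, here is a sketch of the averaging effect. The 0.005 V noise level and the block size of 100 are made-up numbers for the demo, not 9205 or DMM specs; the point is that averaging N independent samples shrinks Gaussian noise by roughly the square root of N.

```python
import random
import statistics

random.seed(0)
TRUE_V = 0.4963  # the steady transducer output measured by the DMM

# Fast, "raw" samples as a cDAQ module would see them, with hypothetical
# Gaussian noise of 0.005 V standard deviation added
raw = [TRUE_V + random.gauss(0, 0.005) for _ in range(10000)]

# Average blocks of 100 samples, mimicking a DMM that integrates over a
# much longer aperture before updating its display
BLOCK = 100
averaged = [sum(raw[i:i + BLOCK]) / BLOCK for i in range(0, len(raw), BLOCK)]

print(statistics.stdev(raw))       # raw noise, around 0.005 V
print(statistics.stdev(averaged))  # roughly 1/sqrt(100) = 10x smaller
```

The same trick works in LabVIEW: acquire N samples per read and plot the mean, and the chart will calm down the way the DMM display does.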
Also please bear in mind the scale of your graphs.
The upper graph Y scale shows a range of 0.4 volts
The lower graph Y scale shows a range of 0.014 volts
Accordingly, the noise is going to look a lot more prominent on the bottom scale, but I suspect the amount of noise will be the same in both cases.
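To put numbers on that, here is a quick check (the 0.005 V noise amplitude is hypothetical) of how much of each chart's vertical span the same amount of noise would occupy:

```python
noise = 0.005  # hypothetical noise amplitude in volts, same on both charts

# Y-axis spans of the two charts from the screenshot
for span in (0.4, 0.014):
    pct = 100 * noise / span
    print(f"{span} V span: noise fills about {pct:.0f}% of the chart height")
```

The identical noise covers about 1% of the 0.4 V chart but over a third of the 0.014 V chart, which is why the lower trace looks so much worse.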
edited to add: Right click on the Y scale of the graph and turn Autoscaling off 😉
Good point regarding the Y scale; I had set it up that way so that each chart showed approximately +/- 2% of the multimeter value.
Is it not concerning that the program initially reads a smooth voltage, then the signal progressively degrades into oscillations after a short period of time? This would indicate some sort of problem to me, although I have no idea where!
Another piece of information I forgot to include: I haven't done anything with the COM terminal on the 9205 module. I assumed this wouldn't be necessary since I'm always reading differential voltages.