09-11-2009 03:17 PM
Greetings,
I have a USB-6218 DAQ and I am trying to measure a 0.1mV signal. In my system I am using LabVIEW SignalExpress (and also the DAQ test panels). I have the signal wired in differential mode with the input range set to 200mV and a 1K resistor between V- and GND (I am using AI18/AI26). The reading is unstable and negative. When I source a known voltage from a thermocouple calibrator, any voltage below 1mV gets unstable. According to the specs the sensitivity is around 6uV, although the range accuracy is 88uV (not sure I totally understand that). Anyway, is my signal too small for this DAQ?
Thanks.
09-14-2009 11:35 AM - edited 09-14-2009 11:36 AM
Hello psusteve,
Thanks for using NI forums. You are correct that the sensitivity of the 6218 is around 6uV when the range is set to +/-200mV. This is the code width of your measurement, meaning that each step of the ADC corresponds to an increment of about 6uV. It is calculated by taking the full range of your signal and dividing it by 2^16 (bits of resolution); in your case the range is 400mV. The range accuracy, on the other hand, is the possible error of the range setting itself, and is something to take into consideration when working with the code width formula above.
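To put numbers on it, here is a quick back-of-the-envelope check in Python, just to illustrate the formula (the 6218's nominal 16-bit ADC is the only assumption):

```python
# Code width (LSB size) for the +/-200 mV range: full range divided by 2^16.
full_scale = 0.400           # volts, -200 mV to +200 mV
bits = 16
code_width = full_scale / 2 ** bits
print(code_width)            # ~6.1e-06 V, i.e. about 6.1 uV per ADC step
```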
As far as the noise level in your signal, I would take a look at the Developer Zone article Field Wiring and Noise Considerations for Analog Signals. One thing I noticed is that you have a bias resistor on the AI- input but not on AI+. In that article there is a section titled "Measuring Floating (Nonreferenced) Sources"; for most differential measurements of floating sources, two bias resistors are recommended to help reduce erratic readings, so check whether you need a second resistor as well. Also, thermocouple measurements are generally noisy anyway, as mentioned in the 621x User Manual, and can require some signal conditioning before the signal is measured. As far as the negative reading, it could be something as simple as the + and - inputs being swapped at the 6218. Please take a look at the article and let me know if you have any further questions.
Regards,
09-14-2009 03:04 PM
Thanks for your reply Brandon,
I'll try a second resistor. To make myself clear, to test the measurement I am making I am sourcing a known voltage from a Fluke 701 Documenting Process Calibrator to the 6218. When I source a voltage of 1mV the 6218 reading varies between 0.6 and 1.0 mV. When I source a voltage of 0.1mV the 6218 gets unstable and even gives a negative reading. Can the 6218 measure small signals below 0.1 mV?
09-15-2009 07:13 PM
The absolute accuracy of the 6218 at that range is 88uV. This means that your measurement error should not be larger than that amount, and since the code width is only about 6uV on the +/-200mV range, you shouldn't be seeing so much fluctuation in your signal. I'm also concerned about the signal source. You mentioned that you were seeing fluctuation of about 400uV, which is well out of spec (88uV). This leads me to believe that you have a noisy signal or some bad connections. You mentioned that you tied a resistor to ground. Which ground are you referencing (mainly wondering whether the Fluke device has its own ground and you are tying to that, or just going off of the 6218)? I would look into obtaining another signal source just to check whether that's the problem.
Another thing I was wondering is whether your signal fluctuation is always about 400uV. How much does the signal vary when you source about 100uV? If you source a signal of 10mV do you still see the same fluctuation? Sorry for all the questions; I'm just trying to narrow down what the problem could be.
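If it helps to take the test panel out of the loop, here is a rough sketch of how you could grab a block of samples and quantify the fluctuation at each source level. This assumes the nidaqmx Python package, a device named "Dev1", and your differential pair on ai18; adjust those names and the sample rate to match your setup.

```python
# Sketch: read 10,000 samples from the differential channel and report the
# mean and standard deviation so the fluctuation can be compared across
# source levels. Device/channel names and sample rate are assumptions.
import numpy as np
import nidaqmx
from nidaqmx.constants import TerminalConfiguration

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai18",
        terminal_config=TerminalConfiguration.DIFF,  # DIFFERENTIAL in older versions
        min_val=-0.2, max_val=0.2)                   # +/-200 mV range
    task.timing.cfg_samp_clk_timing(rate=10000.0, samps_per_chan=10000)
    data = np.array(task.read(number_of_samples_per_channel=10000))

print(f"mean = {data.mean() * 1e3:.4f} mV, std dev = {data.std() * 1e6:.1f} uV")
```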
Regards,
09-16-2009 12:45 PM
Hi Brandon,
Thanks again for your input. Here's an update. I connected 22K ohm resistors between ch18/AI GND and ch26/AI GND. I connected the Fluke 701 to ch18/ch26 and opened the MAX (version 4.3) test panels. The 6218 is connected to earth ground (AI GND) and the Fluke is battery operated. I set the input range to +/-200 mV. I then sourced 200mV from the Fluke and read 200mV from the test panel. I then sourced 100mV, 10mV, 1mV and 0.1mV and read 100mV, 9.7mV, ~0.8mV (jumpy, bouncing between 0.5mV and 0.9mV) and ~ -0.2mV. For the 0.1mV signal I had to switch from "On Demand" (too jumpy) to "Continuous" with "Samples to Read" set to 10,000. I then removed the Fluke source, shorted the channels (18/26) with a short jumper wire, and read ~ -0.3mV. I then connected ch26 directly to AI GND (eliminating the resistor) and got the same -0.3mV reading. I moved the short to another channel pair (23/31) and got the same results. I tried shielding and physically moving the 6218 to a different location and that made no difference. For some reason there appears to be a roughly -0.3mV offset. Thanks again and I appreciate any insight that you can offer.
Steve
09-17-2009 06:52 PM
Hi Steve,
That is some pretty strange behavior. Have you tried taking an FFT of your signal to see whether that 0.3mV offset sits at a particular frequency? I just want to verify that you're not seeing 60 Hz pickup from some ambient light source.
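For reference, here is a rough sketch of that kind of check with NumPy. The sample rate and the synthetic stand-in data are assumptions just to keep the example runnable; substitute the samples you actually acquire from the 6218.

```python
# Sketch: inspect the spectrum to see whether the 0.3 mV is a flat DC offset
# or sits at a specific frequency such as 60 Hz pickup. The stand-in data
# (assumed -0.3 mV offset plus a little 60 Hz) is only a placeholder.
import numpy as np

fs = 10000.0                                         # assumed sample rate, Hz
t = np.arange(10000) / fs
data = -0.3e-3 + 50e-6 * np.sin(2 * np.pi * 60 * t)  # stand-in for real samples

freqs = np.fft.rfftfreq(len(data), d=1.0 / fs)
spectrum = np.abs(np.fft.rfft(data)) / len(data)

peak = freqs[np.argmax(spectrum[1:]) + 1]            # largest non-DC component
print(f"DC level: {spectrum[0] * 1e3:.3f} mV, largest AC component near {peak:.1f} Hz")
```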
Regards,
09-21-2009 12:23 PM
Hi Brandon,
I resolved the issue today. I removed the 6218 from the lab and tested it in my office to see if it was a room noise issue. The 6218 gave the same -0.3mV offset. I called tech support and they had me run a self-calibration, which eliminated the offset. Now with averaging I should be able to measure a 0.1mV signal. Thanks again for your input.
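In case anyone finds this thread later and wants to script the same fix, here is a rough sketch of the self-calibration plus averaging. It assumes the nidaqmx Python package, a device named "Dev1", and the differential pair on ai18; the self_cal call is meant as the programmatic equivalent of the self-calibration tech support had me run in MAX.

```python
# Sketch: run a self-calibration, then average a block of samples to resolve
# a signal on the order of 0.1 mV. Device/channel names are assumptions.
import numpy as np
import nidaqmx
from nidaqmx.constants import TerminalConfiguration
from nidaqmx.system import Device

Device("Dev1").self_cal()        # removes the DC offset seen above

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai18",
        terminal_config=TerminalConfiguration.DIFF,
        min_val=-0.2, max_val=0.2)
    task.timing.cfg_samp_clk_timing(rate=10000.0, samps_per_chan=10000)
    samples = np.array(task.read(number_of_samples_per_channel=10000))

print(f"averaged reading: {samples.mean() * 1e3:.4f} mV")
```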
Steve