08-25-2011 05:13 AM
I have a DAQ card 6036E. I am trying to acquire very low voltages (of the order of 0.004 mV and higher) using this card, with the inputs connected in differential mode. The output (mean DC) of the sensor observed using this card and LabVIEW is not the same as that seen on a 6-1/2 digit multimeter (make: FLUKE). If I perform a self-calibration of the card, the outputs of the multimeter and the 6036E become the same, but after 5 to 6 runs the values start differing again. The maximum difference between the multimeter and the 6036E is of the order of 0.06 mV. If I do a self-calibration once more, the output of the multimeter and the card becomes equal again. Moreover, the output seen from the card for the same condition also varies from run to run by up to 0.01 mV.

I have connected the sensors using shielded wires and kept all AC lines away from the DAQ card so as to avoid noise, but the problem still persists. I have tried various combinations of number of samples and rate; mostly I acquire the data for 1 second at a rate of 500 Hz. In the DAQ Assistant I have set the range to +/- 5 mV.
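(For reference, the configuration described above corresponds roughly to the sketch below. It is written with the Python nidaqmx API purely as an illustration; the device and channel name "Dev1/ai0" is a placeholder, and the actual setup uses the DAQ Assistant in LabVIEW.)

```python
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, TerminalConfiguration

# Sketch of the acquisition described above; "Dev1/ai0" is a placeholder
# for the actual sensor channel.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",
        terminal_config=TerminalConfiguration.DIFF,  # differential inputs (DIFFERENTIAL in older nidaqmx releases)
        min_val=-0.005, max_val=0.005)               # requested +/- 5 mV range
    # 1 second of data at 500 S/s
    task.timing.cfg_samp_clk_timing(
        rate=500, sample_mode=AcquisitionType.FINITE, samps_per_chan=500)
    samples = np.array(task.read(number_of_samples_per_channel=500))

print(f"mean DC: {samples.mean() * 1e3:.4f} mV")
```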
Do I need to perform a self-calibration before each run? Or is there any way to clear any buffered values at each run? Even when my instrument is switched off, the output observed from the card is about 0.06 mV (mostly), but if I do a self-calibration, the output initially becomes 0.001 mV or less, whereas the multimeter shows 0.000 mV.
Why is the output from the card never a constant value?
Please help me.
08-26-2011 08:11 AM
Hi,
The accuracy of the 6036E on its minimum voltage range is 0.061 mV. As you said, the maximum difference between the multimeter and the 6036E is of the order of 0.06 mV. This is within the minimum voltage range accuracy limit of the 6036E, so it is explained by the card's specification.
Trust this helps.
08-26-2011 08:31 AM
Hi Kanchan,
Thanks for your reply.
When I do a self-calibration of the card, the multimeter and the card show the same values, but after a few runs their outputs start to differ. It is not that the difference is 0.06 mV from the start; it varies with time (sometimes it is 0.01 mV, 0.04 mV, 0.06 mV, etc.). If I do a self-calibration again after some time, the difference vanishes. Is there any option in LabVIEW 7... which will drop any buffered values in the card?
Do I need to self-calibrate the card before every run, or is there some option that I am missing?
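(If it does turn out that a self-calibration is needed before every run, it could presumably be triggered programmatically at the start of each run rather than by hand. The sketch below assumes the Python nidaqmx API's Device.self_cal() call and a placeholder device name "Dev1"; the same DAQmx self-calibrate operation should also be reachable from LabVIEW.)

```python
import nidaqmx.system

# Sketch only: run a self-calibration programmatically before each acquisition.
# "Dev1" is a placeholder for the 6036E's device name as it appears in MAX.
device = nidaqmx.system.Device("Dev1")
device.self_cal()
```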
08-26-2011 10:48 AM
You could try scanning in a grounded channel, and subtracting its readings from your measurement. In other words, do autozero, just like your DMM does behind the scenes.
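A rough sketch of that autozero idea, assuming the Python nidaqmx API and placeholder channel names ("Dev1/ai0" for the sensor, "Dev1/ai1" for a channel whose + and - inputs are tied to AI GND); the same approach works in LabVIEW by adding the grounded channel to the task and subtracting its mean:

```python
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, TerminalConfiguration

SIGNAL_CH = "Dev1/ai0"  # sensor channel (placeholder name)
ZERO_CH = "Dev1/ai1"    # channel with its + and - inputs tied to AI GND

with nidaqmx.Task() as task:
    for ch in (SIGNAL_CH, ZERO_CH):
        task.ai_channels.add_ai_voltage_chan(
            ch,
            terminal_config=TerminalConfiguration.DIFF,
            min_val=-0.005, max_val=0.005)
    task.timing.cfg_samp_clk_timing(
        rate=500, sample_mode=AcquisitionType.FINITE, samps_per_chan=500)
    data = np.array(task.read(number_of_samples_per_channel=500))

signal_mean = data[0].mean()   # mean DC of the sensor channel
offset_mean = data[1].mean()   # residual offset seen on the grounded channel
print(f"autozeroed mean DC: {(signal_mean - offset_mean) * 1e3:.4f} mV")
```

Because both channels are scanned through the same input amplifier, subtracting the grounded channel's mean should remove most of the offset drift that the repeated self-calibrations are currently correcting for.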