09-17-2024 10:28 AM
Hello to you all,
I need to measure some prototypes to check whether they comply with the requirements. I tried to match the NI hardware specs to the prototype requirements, but the prototypes need a lower NI gain error and offset error.
As I understand it, NI uses a and b in the following formula to store its calibration data:
value = a·x + b
This leaves a residual offset error (b) and gain error (a). For instance, the NI-9203 has a gain error of ±0.04% and an offset error of 0.02% when calibrated.
My question is: what can I do to get better accuracy? I imagine a single 'a' value is not enough in this case, but I could use a calibration method that measures the inaccuracy over 20 points, e.g. at 0, 1, 2, ..., 19, 20 mA, and then use that table in combination with some interpolation to achieve better accuracy.
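A minimal sketch of what such a table-plus-interpolation correction could look like (Python/NumPy; the 0 to 20 mA calibration points and the simulated module readings below are invented placeholders, not real NI-9203 data):

```python
import numpy as np

# Hypothetical calibration table: reference currents applied by a calibrator
# and the corresponding raw readings reported by the module (both in mA).
reference_mA = np.arange(0.0, 21.0, 1.0)        # 0, 1, ..., 20 mA
measured_mA  = reference_mA * 1.0004 + 0.003    # placeholder readings with a small gain/offset error

def correct(raw_mA):
    """Map a raw module reading back onto the reference scale by
    piecewise-linear interpolation through the calibration table."""
    # np.interp expects the x grid (the measured values) to be increasing.
    return np.interp(raw_mA, measured_mA, reference_mA)

print(correct(10.007))   # ~10.0 mA after correction
```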
However, I can also imagine this method will not hold over time and that the table will need to be refreshed with new calibration data every hour, week, or month.
Can someone please tell me how it's normally done?
Thanks for the help!
09-17-2024 11:24 PM
Even if you do an advanced calibration, the challenge you will face is drift in accuracy due to temperature variations and inherent long-term drift. So the only option is to recalibrate at short intervals to get better accuracy.
NI's gain/offset errors are guaranteed across the temperature range (check the specification) and are more of a worst-case scenario that accounts for manufacturing differences; the actual error you observe will typically be better than that.
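To put those percentages into absolute terms, here is a rough worked example, assuming the common "gain error × reading + offset error × range" model and a 20 mA range for the NI-9203 (both assumptions should be verified against the data sheet):

```python
# Assumed worst-case model: gain error applies to the reading, offset error
# to the full range; formula and range are assumptions to check in the spec.
gain_error    = 0.0004   # 0.04 % of reading (calibrated spec quoted above)
offset_error  = 0.0002   # 0.02 % of range
full_range_mA = 20.0     # assumed range

def worst_case_error_mA(reading_mA):
    return gain_error * abs(reading_mA) + offset_error * full_range_mA

print(worst_case_error_mA(20.0))   # ~0.012 mA at full scale
print(worst_case_error_mA(4.0))    # ~0.0056 mA at 4 mA
```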
If you need a high-accuracy current measurement, you can use a voltage input module with an external shunt. However, the stability and accuracy of the external shunt are then unknown and become the bottleneck that defines the overall accuracy.
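A small sketch of the shunt approach and its first-order error budget; the shunt value, its tolerance/drift, and the voltage-module error are placeholder numbers, not specs of any real part:

```python
# Hypothetical shunt + voltage-module current measurement (all numbers are placeholders).
R_shunt_ohm     = 100.0    # nominal shunt resistance
shunt_rel_error = 0.001    # 0.1 % combined tolerance + drift of the shunt
volt_rel_error  = 0.0003   # 0.03 % relative error of the voltage module

def current_mA(voltage_V):
    return voltage_V / R_shunt_ohm * 1000.0

# For I = V / R the relative errors add to first order, so the shunt's
# stability directly caps the achievable overall accuracy.
total_rel_error = volt_rel_error + shunt_rel_error
print(current_mA(1.0), total_rel_error)   # 10.0 mA, 0.13 % worst case
```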
09-20-2024 09:34 AM - edited 09-20-2024 09:41 AM
To qualify your instrument for a lower (than spec) uncertainty you 'just' have to:
Do a lot of calibrations (mean comparisons) against another instrument/standard with a lower uncertainty (a simplified sketch of the bookkeeping follows this list), while
(as Santosh noted) defining tighter boundaries (mainly a more stable temperature).
Do it over (at least; more is better) one recalibration period as defined in your QMH. (Note that shorter periods allow lower uncertainties to be obtained; the specs of high-quality equipment typically used as calibration standards quote numbers for 24 h, 90 d, 1 y ... guess why.)
Document it, write and sign a new spec for your device, and hope the assessor will accept it 😄
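A simplified sketch of that bookkeeping; the weekly deviation values are invented, and a real analysis would follow your QMH and also include the reference standard's own uncertainty and temperature terms:

```python
import numpy as np

# Hypothetical comparison log against a lower-uncertainty standard:
# deviation of the device under test at 20 mA, one comparison per week (mA).
deviations_mA = np.array([0.0031, 0.0029, 0.0035, 0.0028, 0.0033,
                          0.0036, 0.0030, 0.0034, 0.0032, 0.0037])

mean_dev = deviations_mA.mean()               # systematic part, could be corrected out
spread   = 2.0 * deviations_mA.std(ddof=1)    # ~95 % spread over the observed period

print(f"mean deviation: {mean_dev:.4f} mA, 2-sigma spread: {spread:.4f} mA")
# The spread over (at least) one full recalibration period is what you would
# argue in the new, tighter spec.
```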
Finally, for a lot of devices you might find that this is not possible, because the drift due to aging is often not predictable, or the inherent noise would require too long a measuring period, .... (guess what the manufacturers do 😄)
Usually, better-specced instruments are on the market ... so spend money on either time or hardware.