08-31-2015 08:22 PM
This question applies generally to any signal and digitizer, but in particular we have a PXI-5105 and are wondering whether there is any statistical basis for improving absolute accuracy via oversampling. We have a signal at 2 kHz and of course this scope can do 60 MHz. Some on the team have batted around the central limit theorem and shown some small improvement, but it isn't nearly enough to get what we need.
09-01-2015 09:42 AM
Oversampling spreads the quantization noise across a wider band, lowering the noise floor within your band of interest; averaging the extra samples back down then buys you additional effective resolution. See:
'Oversampling with averaging to increase ADC resolution'
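Here is a minimal Python sketch (hypothetical numbers, not PXI-5105 driver code) of why this only goes so far: averaging N samples knocks down random noise by roughly sqrt(N), but it does nothing to the fixed offset and gain terms that dominate an absolute accuracy spec.

import numpy as np

rng = np.random.default_rng(0)

true_value   = 2.500   # V, hypothetical sensor level
gain_error   = 1.002   # fixed 0.2% gain error -- not reduced by averaging
offset_error = 0.010   # fixed 10 mV offset -- not reduced by averaging
noise_rms    = 0.005   # 5 mV RMS random noise -- this part averages down
lsb          = 0.001   # quantization step, hypothetical

def measure(n_samples):
    """Average n oversampled readings of a DC level."""
    raw = true_value * gain_error + offset_error \
          + rng.normal(0.0, noise_rms, n_samples)
    quantized = np.round(raw / lsb) * lsb
    return quantized.mean()

for n in (1, 16, 256, 4096):
    err = measure(n) - true_value
    print(f"N={n:5d}  error = {err * 1e3:+7.3f} mV")

No matter how large N gets, the error converges to the fixed ~15 mV gain/offset term in this example, which matches your observation that the central limit theorem only buys a small improvement.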
Hope this helps!
~ Q ~
09-16-2015 08:39 PM
I thought oversampling might be the reason for an effect we are seeing, which is why I posed the question, but perhaps a more direct explanation of what we are doing will help.
We have a test system with two data acquisition systems, system A and system B. On system A we do an end-to-end calibration/scaling of the sensors connected to it, so a voltage as read by that system is scaled to the engineering value (pressure, flow, etc.).
The other system is the above-mentioned 5105, to which we route, via a switch matrix, up to 4 of the signals that are also available on system A. One of us had the idea to calculate the DC offset from system A to system B at run time and then use system A's end-to-end calibration to scale the voltages read via the 5105.
That is, if we want to sample a pressure sensor that system A reads as 2.5 V and the 5105 reads as 2.4 V, we take this 0.1 V offset and apply it to all other readings on that channel in the seconds after we calculate that "real-time" offset. We make the basic assumption that the two inputs are reasonably linear with respect to each other.
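In code terms the correction is trivial; a minimal Python sketch with hypothetical readings (the real system pulls these values from system A and the 5105 at run time):

def compute_offset(v_system_a, v_5105):
    """Offset that maps a 5105 reading onto system A's scale."""
    return v_system_a - v_5105

def correct(v_5105, offset):
    """Apply the stored per-channel offset to a later 5105 reading."""
    return v_5105 + offset

# At "calibration" time: system A reads 2.500 V, the 5105 reads 2.400 V.
offset = compute_offset(2.500, 2.400)   # +0.100 V

# Seconds later, a new 5105 reading of 2.315 V is mapped onto system A's
# scale before applying A's engineering-unit calibration:
print(correct(2.315, offset))           # 2.415 V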
What we find is that by doing this, the readings we get on the 5105 are usually within 1-2 mV of the readings from system A. This is remarkably better than the ~40 mV absolute accuracy spec of the 5105.
I don't fully understand this, other than that we must be converting absolute accuracy into relative accuracy, which is always going to be better?
09-17-2015 04:22 PM
The absolute accuracy spec is for worst-case scenarios. Also, accuracy describes how close a measurement is to the real value.
'Measurement Accuracy of a Data Acquisition Board'
http://www.ni.com/white-paper/4517/en/
What you are describing seems to be more related to precision, or how close together values are. So the two measurements might be within 2 mV of one another, but they may each be 30 mV away from the true value.
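With made-up numbers, a quick Python illustration of that distinction:

true_value = 2.470   # V, unknown to either system
system_a   = 2.501   # 31 mV absolute error
pxi_5105   = 2.499   # 29 mV absolute error

print(f"A vs 5105:    {abs(system_a - pxi_5105) * 1e3:.1f} mV")   # 2.0 mV agreement
print(f"A vs true:    {abs(system_a - true_value) * 1e3:.1f} mV") # 31.0 mV
print(f"5105 vs true: {abs(pxi_5105 - true_value) * 1e3:.1f} mV") # 29.0 mV

Your offset transfer makes the 5105 inherit whatever error system A has, so the agreement between the two is excellent even though neither is necessarily that close to the true value.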
I don't mean to give a lecture on measurement fundamentals, but I am not sure you are evaluating your system with the right spec in mind.
~ Q ~