Multifunction DAQ


Calculating the accuracy of the NI-9237



I just finished calling one of the salespeople, and I was disappointed to hear that the NI-9237 isn't going to give me the accuracy I wanted. I just want to confirm this.

So I am using multiple strain gage load cells, and I believe multiple NI-9237 modules are required, as well as a cDAQ chassis.

The specs of the load cells are: max output of 4 mV/V, max load of 5,000 lbs, and max excitation of 10 VDC.

The NI-9237 has a maximum voltage range accuracy of 0.038 mV/V. Does that mean that, regardless of the range, the DAQ is only going to detect 0.038 mV/V of change?

For example: 

Assuming that I use 10 VDC excitation, the max output would be 40 mV. Therefore the smallest increment that could be read from the DAQ would be (0.038/40)*5000 = 4.75 lbs?


What I thought was that, knowing the NI-9237 has 24 bits of resolution, the smallest increment of lbs that I could read would be (40/(2^24))*5000 = 0.012 lbs.


If so, then what does maximum voltage range accuracy mean?


Thank you!

Message 1 of 7


Accuracy and resolution are two very different things.


In order to calculate the accuracy, you will need the accuracy of the DAQ device (which, as you have seen from the spec page, is 0.038 mV/V), the type of bridge completion (quarter, half, or full), and the gage factor of your particular strain gage.


Refer to this white paper:


Resolution is a metric of the ADC. As you noted, the NI-9237 has a 24-bit ADC, so if the range of lbf for that strain gage is 0–5,000 lbf, the calculation is 5,000/(2^24) = 2.98×10^-4 lbf, or 0.000298 lbf.
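As a quick sanity check, the resolution arithmetic above can be reproduced in a few lines (this is just the ideal-ADC calculation from this post, assuming the full 24-bit code range maps onto the 0–5,000 lbf span):

```python
# Ideal resolution of a 24-bit ADC mapped onto a 0-5,000 lbf load cell range.
full_scale_lbf = 5000.0
adc_bits = 24

# Smallest load increment one ADC code represents.
resolution_lbf = full_scale_lbf / (2 ** adc_bits)
print(f"{resolution_lbf:.6f} lbf")  # 0.000298 lbf
```

Note this is the theoretical code width only; effective resolution in practice is reduced by noise.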

Message 2 of 7

Thank you for the reply. 


So in that case, what is the significance of the maximum voltage range accuracy? How does it affect my readings?

Message 3 of 7


That is the accuracy of the 9237 within the 1-year calibration specs, for the maximum range and maximum reading, without considering noise.


The maximum input range is ±25 mV/V (a span of 50 mV/V), and the maximum reading would be 25 mV/V (or -25 mV/V, but we'd take the absolute value of that anyway).


And so [(25 mV/V reading) × (0.05%)] + [(50 mV/V range) × (0.05%)] = 0.0375 mV/V, which rounds to 0.038 mV/V.
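In code, that reading-plus-range calculation looks like the sketch below. The 0.05% figures and the ±25 mV/V range come from this post; the conversion to load units at the end is my own illustration using the 4 mV/V, 5,000 lbf cell from the original question, not a number from the spec sheet:

```python
# Worst-case accuracy per the formula above:
# (max reading * error % of reading) + (range span * error % of range).
max_reading_mV_per_V = 25.0   # |reading| at full scale
range_span_mV_per_V = 50.0    # -25 mV/V to +25 mV/V
pct_of_reading = 0.0005       # 0.05%
pct_of_range = 0.0005         # 0.05%

accuracy_mV_per_V = (max_reading_mV_per_V * pct_of_reading
                     + range_span_mV_per_V * pct_of_range)
print(f"{accuracy_mV_per_V:.4f} mV/V")  # 0.0375 mV/V

# Illustration only: what that error bound means in load units for a
# hypothetical 4 mV/V full-scale, 5,000 lbf load cell.
accuracy_lbf = accuracy_mV_per_V / 4.0 * 5000.0
print(f"{accuracy_lbf:.1f} lbf")  # 46.9 lbf
```

Comparing this against the resolution figure makes the accuracy-versus-resolution distinction concrete: the ADC can resolve fractions of a pound, but the worst-case accuracy bound is tens of pounds at full range.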

Message 4 of 7

Thanks again for the reply.

Everywhere I've looked, I found that accuracy is calculated as a percentage of the cell's range. Why is the accuracy term for the 25 mV/V reading added to the accuracy term for the 50 mV/V range?


Message 5 of 7



There is an accuracy component associated with the range in which you are taking measurements, as well as one associated with the magnitude of the measurement itself. The closer you get to the edge of your range, the less accurate your measurement becomes.


That is why the formula for calculating accuracy uses both the range and the highest reading in that range, giving you the worst-case accuracy.

Message 6 of 7

Thank you!!

Message 7 of 7