Hi Bruce,
Let's look at the
FSH00682 for example. The
spec sheet
specifies a rated output of 2 mV/V nominal, a maximum excitation of
18 V, and a rated capacity of 5000 in-lb. This means that every volt
of excitation produces 2 mV of output at the rated capacity. With the
682, for example, if you excite the sensor with 10 V, you will read
20 mV when the torque is 5000 in-lb. John at Futek told me that the
relationship is nearly
linear, but the actual calibration curve ships with the sensor.
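To make the arithmetic concrete, here is a quick Python sketch of the nominal numbers above. The 2 mV/V figure is the datasheet nominal, not your sensor's calibrated value, and the variable names are mine:

```python
# Nominal FSH00682 figures from the spec sheet.
sensitivity_mv_per_v = 2.0      # rated output, mV/V (nominal)
excitation_v = 10.0             # chosen excitation (18 V max allowed)
rated_capacity_inlb = 5000.0    # rated torque capacity, in-lb

# Each volt of excitation contributes sensitivity_mv_per_v millivolts
# of output at the rated capacity, so full-scale output is:
full_scale_mv = sensitivity_mv_per_v * excitation_v
print(full_scale_mv)  # 20.0 mV at 5000 in-lb
```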
You wrote that your sensor is outputting 1.6125 mV/V; I'm assuming you
got this value from your calibration sheet. If you were using the
AI-110 to measure voltage off the 682 directly, you should use the
±60 mV input range for the AI-110 channel. If you measure 0 V, the
torque at the sensor is 0 in-lb. If you excite the sensor with 10 V and
measure 16.125 mV, the torque at the sensor is 5000 in-lb. If you use y
= mx + b, where m = delta_y / delta_x and b is the 0-torque calibration
point, you find that every 3.225 µV corresponds to 1 in-lb of torque.
16 bits of resolution over ±60 mV give a step of about 1.83 µV per
count, so each count corresponds to roughly 0.57 in-lb; you can resolve
torque changes to slightly worse than 0.5 in-lb. Please
let me know if this helps, and if you have additional questions.
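In case it's useful, here is a short Python sketch of the y = mx + b conversion using your calibrated 1.6125 mV/V figure and 10 V excitation. The function and variable names are mine, just for illustration, and b is taken as zero per your 0 V reading:

```python
# Calibrated figures for this particular sensor.
sensitivity_mv_per_v = 1.6125   # from your calibration sheet
excitation_v = 10.0
rated_capacity_inlb = 5000.0

full_scale_mv = sensitivity_mv_per_v * excitation_v  # 16.125 mV at 5000 in-lb

# Slope m (in-lb per mV) and zero-torque intercept b (your 0 V reading).
m = rated_capacity_inlb / full_scale_mv
b = 0.0

def torque_inlb(measured_mv):
    """Convert a measured bridge output (mV) to torque (in-lb)."""
    return m * measured_mv + b

# Resolution: 16 bits spread over the AI-110's ±60 mV (120 mV total) range.
step_mv = 120.0 / 2**16    # ~0.00183 mV (1.83 µV) per ADC count
step_inlb = m * step_mv    # ~0.57 in-lb of torque per ADC count
```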