Recently, I performed some simple strain measurements using two 350 ohm quarter-bridge strain gauges with appropriate bridge completion. My excitation voltage was set to 10 V. After the measurements were complete, I noticed that my readings were roughly half of what I was expecting. After thinking it over, I realized that I had exceeded the excitation power limit of the NI 9237; however, now I'm stuck with data that I'm trying to salvage. Of course, the only way to salvage the data is if I know, at least roughly, what excitation voltage the 9237 was actually using. Thus, my question is as follows: how does the 9237 decrease the excitation voltage to stay within its 150 mW power limit? Does it drop to the next lowest predefined excitation level that fits within the power budget, or does it drop the voltage only enough to hold exactly 150 mW (or some other constant, calculable value)?
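For reference, the arithmetic behind the power limit can be sketched as follows. This is a minimal sketch, assuming the 9237 falls back to the next lowest of its predefined internal excitation levels (10, 5, 3.3, 2.5 V) that fits the ~150 mW budget; that fallback rule is an assumption to be verified against the module's documentation, not a confirmed behavior.

```python
# Sketch: estimate which predefined excitation level the NI 9237 might fall
# back to when the requested level would exceed its ~150 mW excitation budget.
# ASSUMPTIONS: the candidate levels and the "next lowest level" fallback rule.

def bridge_power(v_excitation, bridge_resistance_ohms, n_bridges):
    """Total power drawn by n completed (full) bridges at the given excitation."""
    return n_bridges * v_excitation ** 2 / bridge_resistance_ohms

def max_allowed_excitation(requested_v, bridge_resistance_ohms, n_bridges,
                           power_limit_w=0.150,
                           levels=(10.0, 5.0, 3.3, 2.5)):
    """Highest predefined level, at or below the request, within the budget."""
    for level in sorted(levels, reverse=True):
        if (level <= requested_v and
                bridge_power(level, bridge_resistance_ohms, n_bridges) <= power_limit_w):
            return level
    return min(levels)

# Two 350-ohm quarter bridges (each completed to a full 350-ohm bridge) at 10 V:
print(bridge_power(10.0, 350.0, 2))           # ~0.571 W, well over 150 mW
print(max_allowed_excitation(10.0, 350.0, 2)) # 5.0 V under these assumptions
```

Under these assumptions the module would excite at 5 V instead of the requested 10 V, which would also explain readings coming out at roughly half the expected value.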
I apologize if this is in the incorrect forum.
Thank you for your time.
Actually, I've managed to figure this one out myself; however, now I've got a follow-up question:
When you set the T value to true for a strain gauge/load cell (meaning that it will automatically divide by the excitation voltage before it outputs the data), is it dividing by the excitation voltage that was initially set by the user, or by the voltage that the module has actually adjusted to?
This is a good question. The ADC in the 9237 will use the actual excitation voltage as the reference, not necessarily the value specified by the user in the task.
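In other words, data acquired with ratiometric scaling should already be correct even after the power fallback. A correction is only needed if readings were scaled (e.g. in post-processing) by the nominal excitation rather than the actual one. A minimal sketch of that correction, assuming the actual level was 5 V (an assumption, not a measured value):

```python
# Sketch: recover ratiometric (V/V) readings that were divided by the nominal
# excitation (e.g. the 10 V set in software) when the module actually excited
# the bridge at a lower level (e.g. 5 V after the power fallback).
# ASSUMPTION: the 5 V actual excitation level used in the example below.

def rescale_ratio(reading_v_per_v, nominal_excitation_v, actual_excitation_v):
    """Rescale a V/V reading from nominal-excitation scaling to the value
    it would have had if scaled by the actual excitation."""
    return reading_v_per_v * nominal_excitation_v / actual_excitation_v

# A reading that came out at half the expected value:
corrected = rescale_ratio(0.0005, nominal_excitation_v=10.0,
                          actual_excitation_v=5.0)
print(corrected)  # 0.001 V/V
```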
I am not getting 10 V from my NI 9237. I have connected it with an RJ-50 cable and am measuring the voltage across the excitation pins (blue and purple wires). There is no load cell connected, and I am only getting 2.5 V.
Please let me know.
Are you configuring the hardware in a MAX task? In LabVIEW?
How have you set up the hardware, in software, to output an excitation voltage?
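One possibility worth checking: if the excitation level was never set explicitly, the channel may simply be using a low default level such as 2.5 V. A minimal sketch of explicitly requesting 10 V internal excitation using the nidaqmx Python API; the device name "cDAQ1Mod1" is a placeholder, and the exact parameter values (gage factor, bridge type) are assumptions to adapt to your setup. This requires connected hardware to run.

```python
# Sketch: explicitly configure the excitation on a 9237 strain channel with
# the nidaqmx Python package. ASSUMPTIONS: module name "cDAQ1Mod1", a
# quarter-bridge type I configuration, and a gage factor of 2.0.
import nidaqmx
from nidaqmx.constants import ExcitationSource, StrainGageBridgeType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_strain_gage_chan(
        "cDAQ1Mod1/ai0",
        strain_config=StrainGageBridgeType.QUARTER_BRIDGE_I,
        voltage_excit_source=ExcitationSource.INTERNAL,
        voltage_excit_val=10.0,        # request 10 V; the 9237 may still
                                       # reduce this to stay within 150 mW
        nominal_gage_resistance=350.0,
        gage_factor=2.0,
    )
    data = task.read(number_of_samples_per_channel=10)
```

If the equivalent excitation setting in your MAX task or LabVIEW DAQmx Create Channel VI was left at its default, that could account for seeing 2.5 V at the pins.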