I want to know if it is possible to implement a kind of "autorange" for the analog inputs of the sbRIO-9636 board.
The goal is to increase accuracy when measuring small voltages.
But I am not sure whether something could be destroyed if I apply a high voltage while a sensitive range is selected.
Do you have any hints or tips?
Not with just the sbRIO-9636 — you would need some external circuitry to convert whatever your input voltage is to the 0-10V (or 0-5V... I can't remember which) of the sbRIO. In any case, you would need at least some indication of which range is in use along with your measurement so you can scale it correctly in software.
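To illustrate the "scale it correctly in software" part: a minimal host-side sketch, assuming a signed raw ADC code and a known full-scale voltage for whichever range is active. The function name, bit depth, and values are illustrative assumptions, not any NI API.

```python
# Hypothetical sketch: convert a raw bipolar ADC code to volts using
# the full-scale voltage of the currently selected range.
# All names and values here are illustrative assumptions.

def scale_reading(raw_code, range_full_scale_v, adc_bits=16):
    """Scale a signed raw ADC code to volts for the active range."""
    half_range_counts = 2 ** (adc_bits - 1)  # e.g. 32768 for 16-bit bipolar
    return raw_code / half_range_counts * range_full_scale_v

# Example: the same raw code means different voltages in different ranges.
v_in_10v_range = scale_reading(11560, 10.0)  # interpreted on +/-10 V
v_in_2v_range = scale_reading(11560, 2.0)    # interpreted on +/-2 V
```

The point is simply that the raw code alone is ambiguous: without knowing which range produced it, you cannot recover the voltage.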
There's some information on these very forums about that here: http://forums.ni.com/t5/Multifunction-DAQ/How-to-build-an-auto-range-measurement/td-p/253778
More of an EE question than a LabVIEW question, but there are probably others here with more wisdom in this domain 🙂
The sbRIO-9626 and sbRIO-9636 have the ability to set the gain of the programmable gain instrumentation amplifier (PGIA) within the LabVIEW FPGA Block Diagram using the 'Voltage Range' Property Node.
You may be used to setting these settings in the LabVIEW Project by right-clicking the AIO and selecting Properties then changing the Voltage Range before compilation. When you set this property within LabVIEW FPGA, it will override the LabVIEW Project setting once called.
In your LabVIEW FPGA code, you can first use the +/- 10V (widest) range and check the voltage on each channel to see whether the voltage range can be adjusted to a smaller value. For example, if you are using the +/- 10V range but have been reading 1.156 V for a specific amount of time, then you can change the voltage range of that channel to the +/- 2V range. It may be beneficial to autorange only when you press a button or when it is needed, so there is no overhead of checking and adjusting before actually taking the readings.
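The decision logic described above can be sketched in pseudocode form (shown here in Python, since the actual implementation would be a graphical LabVIEW FPGA diagram). The threshold percentages and the `pick_range` helper are illustrative assumptions; only the range list itself comes from the manual.

```python
# Hypothetical sketch of the autorange decision: start in the widest
# range, step down when the reading fits comfortably in a smaller
# range, step back up when near full scale. Thresholds are assumptions.

RANGES = [10.0, 5.0, 2.0, 1.0]  # sbRIO-9636 input ranges in volts (+/-)

def pick_range(reading_v, current_index):
    """Return the index into RANGES to use for the next reading."""
    mag = abs(reading_v)
    # Step up immediately if the reading is near the current full scale.
    if current_index > 0 and mag > 0.95 * RANGES[current_index]:
        return current_index - 1
    # Step down if the reading fits well inside the next smaller range.
    if (current_index < len(RANGES) - 1
            and mag < 0.9 * RANGES[current_index + 1]):
        return current_index + 1
    return current_index

# A steady 1.156 V reading walks down from +/-10 V and settles at +/-2 V,
# because 1.156 V does not fit comfortably inside the +/-1 V range.
idx = 0
for _ in range(5):
    idx = pick_range(1.156, idx)
```

In a real design you would also require the reading to stay below the step-down threshold for some amount of time (as the post suggests) rather than switching on a single sample.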
Each channel can be independently selected, as the PGIA is downstream of the multiplexer, as shown in Figure 22 of the Operating Instructions and Specifications manual.
The manual also states the input ranges and resolutions for the sbRIO-9636 (+/- 10V, 5V, 2V, 1V). I think Sam_Sharp may have been thinking of the sbRIO-9623 or sbRIO-9633, which have a fixed input range of 0-5V.
The LabVIEW Help also has more information on the Voltage Range Property Node:
The input that I am using is a voltage. The sbRIO has four voltage input ranges (+/-10V / +/-5V / +/-1V / +/-0.2V).
To get the best accuracy, I want to use the smallest possible input range. So my idea is: when the bit value is too small, switch to the next lower input range, and when it gets too high, switch to the next higher input range.
All of that during runtime.
But I don't know whether it is possible, and whether something could be destroyed if the voltage is too high for the small range, e.g. applying 10 V to the +/-0.2 V range?
We may have been posting at the same time and you missed my post, but it is possible to change the voltage ranges during runtime. The ranges are actually +/- 10V, 5V, 2V, and 1V, not 0.2V.
You are right. I looked at the wrong datasheet.
Final question: what will happen if I apply 10V while the +/-1V range is selected? Could something be damaged in this case?
The maximum working voltage of each range is also listed in the manual, here:
Anything outside of those ranges is not supported or recommended.