NI 9215 How to scale voltages


Hi all,

I am reading in a 4-20 mA signal through either a 250 ohm resistor, to create a range of 1-5 V, or through a 500 ohm resistor, which would create a range of 2-10 V. I was wondering how I would scale these values (in software?) in order to read accurately. I am using a cDAQ-9185.

Kind Regards,
Victoria

Message 1 of 3

Math.

 

You could also create a scale in DAQmx or Measurement and Automation Explorer.
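In its simplest form, that math is one linear mapping (a sketch, assuming a linear sensor):

    scaled = (V - V_min) / (V_max - V_min) * (EU_max - EU_min) + EU_min

where V_min and V_max are the voltages at 4 mA and 20 mA across your resistor (2 V and 10 V with 500 ohms, 1 V and 5 V with 250 ohms), and EU_min/EU_max are the engineering-unit values your sensor reports at those two endpoints.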

Message 2 of 3
Solution
Accepted by topic author Victoria1

@Victoria1 wrote:

I am reading in a 4-20 mA signal through either a 250 ohm resistor, to create a range of 1-5 V, or through a 500 ohm resistor, which would create a range of 2-10 V. I was wondering how I would scale these values (in software?) in order to read accurately. I am using a cDAQ-9185.


You have a sensor that uses a standard Current Loop (4-20 mA), which you convert to a voltage in the range 2-10 V by measuring the voltage drop across a 500 ohm resistor.  So far, so good.  Now you ask about how to "read accurately".  What does that mean?
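(As a quick worked check, by Ohm's Law V = I × R: 0.004 A × 500 Ω = 2 V and 0.020 A × 500 Ω = 10 V, hence the 2-10 V range; the same currents through 250 Ω give 1-5 V.)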

 

What I assume you want to know is how to get the reading to be in "Units of Device Measurements", such as Temperature (in, say, °C) or Flow (in, say, L/m).  This, in turn, depends on both the Sensor and on how much you trust it (quick -- who is famous for the quote "Trust, but Verify"?).

 

I've just dealt with this, myself -- I have some Flow meters that have the property that (a) Current is linear with flow, (b) the Maximum current (20 mA) corresponds to a flow of 200 L/m, (c) the Minimum current (4 mA) corresponds to a flow of 0 L/m, and (d) the 500 Ohm resistor is within 1% of 500 Ohm.
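In code, that conversion is one linear formula.  Here is a minimal Python sketch using the numbers above (the function name and defaults are purely illustrative):

def volts_to_flow(volts, r_ohms=500.0, i_min=0.004, i_max=0.020,
                  flow_min=0.0, flow_max=200.0):
    """Convert a measured voltage to flow (L/m), assuming a linear sensor."""
    amps = volts / r_ohms                     # Ohm's Law: I = V / R
    frac = (amps - i_min) / (i_max - i_min)   # 0.0 at 4 mA, 1.0 at 20 mA
    return flow_min + frac * (flow_max - flow_min)

print(volts_to_flow(2.0))    # -> 0.0 L/m
print(volts_to_flow(10.0))   # -> 200.0 L/m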

 

So the first question is "Do I trust these values?"  [I once had some Master's-level engineering students working with an accelerometer whose datasheet said it was linear, with a bias voltage of 1.5 V ± 10% and a gain of 0.3 V/g ± 10%; they wondered why they were getting "silly" readings of acceleration (like not getting exactly 1.5 V when the accelerometer was stationary, regardless of its position).  They'd apparently never heard of doing a calibration of their instruments ...]

 

But what you really want, I suspect, is to be able to convert your Voltage readings (which are already a "conversion" of your device's Current output) to units that your Device uses (Temperature, Flow, whatever).  So let's assume you know (or have checked) the Calibration.  Is it linear?

 

DAQmx (fortunately) has the ability to scale its Inputs according to "rules" that you give it.  The best way to learn about this (including how to specify the Scale Factors) is to hook your device up to your PC running LabVIEW and DAQmx, open MAX, and open a Test Panel for your device.  With the 500 Ohm resistor in place, have the current flow through the resistor and measure the voltage drop, using the NI 9215's ±10 V input range.  Now, in MAX, explore Scaling (I'm going to leave it at that -- use MAX Help if it isn't clear, but don't be afraid to experiment and see if your results make sense).  Once you have figured it out, save the configuration as a Task in MAX.

When you go to write your "real" DAQmx code, you should use the "four-function DAQmx pattern" -- Start Task, DAQmx Read (inside a While Loop, probably), Stop Task, and Clear Task.  Wire a Task constant to the DAQmx Start Task function, click the little triangle symbol, and it will show you all the Tasks you've saved in MAX (you'll probably only have one, the one you made a few sentences ago).  Connect all the DAQmx functions at the Error and Task lines, and you've got the basis of an Acquisition program.
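If you ever drive the same hardware from text-based code, the Task pattern carries over.  Below is a minimal sketch using NI's nidaqmx Python package; the channel name "cDAQ1Mod1/ai0" and the scale numbers are assumptions matching the 2-10 V / 0-200 L/m flow example above, not anything from this thread.

import nidaqmx
from nidaqmx.constants import UnitsPreScaled, VoltageUnits

# Custom linear scale: Flow [L/m] = 25 * V - 50  (2 V -> 0 L/m, 10 V -> 200 L/m)
nidaqmx.scale.Scale.create_lin_scale(
    "FlowScale",
    slope=25.0,
    y_intercept=-50.0,
    pre_scaled_units=UnitsPreScaled.VOLTS,
    scaled_units="L/m",
)

with nidaqmx.Task() as task:           # Clear Task happens automatically on exit
    task.ai_channels.add_ai_voltage_chan(
        "cDAQ1Mod1/ai0",               # hypothetical channel on the 9215
        min_val=0.0, max_val=200.0,    # limits are in *scaled* units here
        units=VoltageUnits.FROM_CUSTOM_SCALE,
        custom_scale_name="FlowScale",
    )
    task.start()                       # Start Task
    for _ in range(10):
        print(task.read())             # Read one scaled sample per iteration
    task.stop()                        # Stop Task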

 

To learn more about this, do a Web search for "Learn 10 Functions in NI-DAQmx and Handle 80 Percent of your Data Acquisition Applications" (or something close to those words).

 

Bob Schor

Message 3 of 3