LabVIEW

DAQmx custom scale: how do you set MIN and MAX values?

Solved!
Go to solution

Hello everybody,

 

I'm currently using LabVIEW 2010 and experiencing the following problem with a DAQmx custom scale: I first created a custom linear scale (y-intercept = 0, slope = 50) and then a global channel that uses that scale. The real signal acquired from an NI-9205 (AI module) is in the range -8/+8 Volts; when I try to read the measurement from the channel, it seems saturated.

 

I discovered that I can improve the situation by changing AI.Max and AI.Min through a DAQmx Channel Property Node, but here is the point: I was not able to set a range of -1000/+1000, for instance, because it exceeds the maximum allowed and I get an error!

 

So, what are custom scales for?

 

Does anyone know how to fix it?

There seems to be no way to use a large scaling factor! If I want to convert 1 V into 1000 V, I will never achieve it!

 

Many thanks for the help.

 

B.

Message 1 of 15
Solution
Accepted by topic author Blueyes

 

Hi,

 

The custom scale is the function that converts the input voltage into whatever units you want. If you use a linear scale, it has the form:

Data in user units = m * Vin + b

where m is the slope, b is the intercept, and Vin is what comes in from the analog input channel.

 

The MIN and MAX you set are the min and max AFTER scaling. So if you want 1 volt in from channel Ain to show up as 1000 in the code, you want:

  m = 1000, b = 0

Then your MAX and MIN are limited to -10000 and +10000 when the AI channel range is ±10 V. Selecting a MAX or MIN outside this range will cause an error, and numbers read in outside this range will show saturation.

 

The 9205 has selectable input ranges: ±200 mV, ±1 V, ±5 V, and ±10 V. I have not used the 9205, so I am not sure whether this range is selected automatically or set by the user in the module preferences. Either way, it will affect what the possible MAX and MIN are.

 

For example:

If you did what you said and used slope = 50 and intercept = 0, AND you had the system set to ±10 volts, then the MAX and MIN that you can set are limited to ±50*10 = ±500. Thus if you try to put a maximum value of 1000 in there, you will get an error, since it has to be within -500 to +500. Your saturation will occur at ±10 volts (which is the scaled ±500).

If the range is set to ±1 volt, then your MAX and MIN must be within -50 to +50. Any values outside this will produce errors, and if you try to read in voltages beyond ±1 V (±50 scaled), the reading will saturate.
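
As a rough sketch of that arithmetic in plain Python (no DAQmx calls, just the math above; the function name is only for illustration):

def scaled_limits(slope, intercept, hw_min, hw_max):
    # MIN/MAX allowed AFTER scaling, given the hardware input range
    lo = slope * hw_min + intercept
    hi = slope * hw_max + intercept
    return min(lo, hi), max(lo, hi)

# Original case: slope = 50, intercept = 0, module range set to +/-10 V
print(scaled_limits(50, 0, -10.0, 10.0))   # (-500.0, 500.0) -> asking for +/-1000 gives an error

# Same scale with the +/-1 V range selected: the limits shrink to +/-50
print(scaled_limits(50, 0, -1.0, 1.0))     # (-50.0, 50.0)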

 

  I hope this is clear and helpful.

 

 --Alex--

Message 2 of 15

Thank you. This is not intuitive. You would think a scale is a scale and that min/max is raw voltage to protect the system. It seems like there is room for human error in configuring the device with an improper input range.

Message 3 of 15

Good solution. I agree the min/max value is not intuitive.

Message 4 of 15

Hi,

 

Reading the LabVIEW help for DAQmxCreateChannel may help here:

  • minimum value specifies in units the minimum value you expect to measure.
  • custom scale name specifies the name of a custom scale for the channel. If you want the channel to use a custom scale, wire the name of the custom scale to this input and set units to From Custom Scale.

When you apply a scale, you apply different units. Min/Max follow those different units!
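
If it helps to see the same idea spelled out in text, here is a rough sketch using the nidaqmx Python API instead of the LabVIEW VIs discussed in this thread; the channel name "Dev1/ai0" and the scale name are placeholders, and min/max are kept inside the ±500 limit implied by a slope of 50 on the ±10 V range:

import nidaqmx
from nidaqmx.constants import VoltageUnits
from nidaqmx.scale import Scale

# Linear custom scale: scaled value = 50 * volts + 0
Scale.create_lin_scale("x50", slope=50.0, y_intercept=0.0)

with nidaqmx.Task() as task:
    # min_val/max_val are given in SCALED units (here +/-400, inside +/-500)
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",                            # placeholder channel name
        min_val=-400.0, max_val=400.0,
        units=VoltageUnits.FROM_CUSTOM_SCALE,
        custom_scale_name="x50",
    )
    print(task.read())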

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 5 of 15

@jprevost wrote:

Seems like there is space for human error to configure the device to an improper input range.


I disagree.  To somebody who just cares about that measurement, it is much more intuitive to enter the range they expect their measurement to be in (inches, lbs, etc.).  It would introduce more error if you forced them to do the math to get back to the voltage.


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 6 of 15

Hello AlexNCSU,

 

I have a similar question. I am using an NI 9203 for my temperature sensor and getting current values between 4-20 mA, whereas my temperature range is 0-150 deg C. Any idea what my values should be for the y = mx + b equation? (Assuming it is linear.)

 

I have also tried creating a map-range scaling for this, but when I click the start option, nothing comes up on the graph. Surprisingly, there is no error either!

 

Analog current: 4-20 mA

Temperature range: 0-150 deg C

 

Thanks in advance,

Maitrey 

Message 7 of 15

Math. Input range is 16 mA, output range is 150 deg C. So m would be 150/16, or 9.375.

 

Since the min is 4 mA, that means you need to subtract 4 mA times 9.375 degC/mA, or 37.5. So b would be -37.5.
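
As a quick sanity check of that arithmetic in a few lines of Python (working in mA, as above):

# Calibration points: 4 mA -> 0 degC, 20 mA -> 150 degC
m = (150 - 0) / (20 - 4)       # 9.375 degC per mA
b = 0 - m * 4                  # -37.5 degC
print(m, b)                    # 9.375 -37.5
print(m * 4 + b, m * 20 + b)   # 0.0 150.0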

Message 8 of 15

Hi RavensFan,

 

Please have a look at the attached snippets. I am getting an error with these values:

 

Error -200077 occurred at DAQ Assistant
Possible Reason(s):

Requested value is not a supported value for this property. The property value may be invalid because it conflicts with another property.

Property: AI.Min
Requested Value: 4.0e-3
Value Must Be Greater Than: -37.687500
Value Must Be Less Than: -37.312500


Device: cDAQ1Mod1

Message 9 of 15

You said milliamps.

 

Your actual input range is in amps.

 

So the new math is 150 degC / 0.016 A, or m = 9375,

and b would be -(9375 degC/A * 0.004 A), or b = -37.5 (stays the same).

 

And as the earlier messages said, the Min and Max range would now be in your SCALED units: 0 min, 150 max.
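
For completeness, here is a rough sketch of how that configuration might look in the nidaqmx Python API (the channel and scale names are placeholders, not taken from the attached VIs):

import nidaqmx
from nidaqmx.constants import CurrentUnits, UnitsPreScaled
from nidaqmx.scale import Scale

# DAQmx reads current in amps, so: scaled = 9375 * amps - 37.5
#   0.004 A -> 0 degC,  0.020 A -> 150 degC
Scale.create_lin_scale(
    "degC_from_A", slope=9375.0, y_intercept=-37.5,
    pre_scaled_units=UnitsPreScaled.AMPS, scaled_units="degC",
)

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_current_chan(
        "cDAQ1Mod1/ai0",                        # placeholder channel name
        min_val=0.0, max_val=150.0,             # limits in SCALED units (degC)
        units=CurrentUnits.FROM_CUSTOM_SCALE,
        custom_scale_name="degC_from_A",
    )
    print(task.read())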

 

Message 10 of 15