
In Labview 8.5, what happens if the signal input exceeds the signal input range set by the DAQ Assistant?


Hello all,

 

This should be a pretty simple question, but I can't seem to find the answer online and don't currently have the means to test this:

 

I'm using LabVIEW 8.5 and have a VI that imports sensor data through the DAQ Assistant. In the configuration tab there is a signal input range. What happens if my sensor exceeds this range? Will I get a warning? Will the value default to the maximum (or minimum)? I was interested in writing some code to display an error as I approach the limits of this range, but was unsure whether I also need code to display an error once the range is exceeded.

 

Thanks for the help,

Tristan

Message 1 of 6
Solution
Accepted by topic author twolfe13

Hello Tristan,

 

The behavior depends on the range you choose and the device you are using.

 

If you are using a device with only one valid input range, the driver will use this range even if you set a smaller minimum and maximum in the DAQ Assistant.  Thus, if your device only supports ±10V and you set the range to ±8V, you will continue to get valid data after your sensor exceeds 8V, up until you approach 10V.  Once you reach the limit of your device's range, the output will "rail": it will return the maximum value until the signal drops back below that maximum.

 

Note: A device that is nominally ±10V usually has some overhead (like ±10.2V) that is typically specified in the manual.
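To illustrate the railing behavior described above, here is a minimal Python sketch (not NI-DAQmx API code; the function name and default limits are hypothetical) that models how a reading clamps at the device's physical limits:

```python
def read_sample(raw_value, device_min=-10.0, device_max=10.0):
    """Model a railed ADC reading: values beyond the device's
    physical input range clamp ("rail") to the nearest limit."""
    return max(device_min, min(device_max, raw_value))

# Even if the DAQ Assistant range was set to +/-8 V, a +/-10 V device
# still returns valid data between 8 V and 10 V:
print(read_sample(8.5))   # 8.5  (still valid)
print(read_sample(12.3))  # 10.0 (railed at the device maximum)
print(read_sample(-15.0)) # -10.0 (railed at the device minimum)
```

The key point is that the clamping happens at the hardware's range, not at the range typed into the DAQ Assistant.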

 

However, if you are using a device with multiple input ranges then things get more complex.

 

The NI-DAQmx driver will pick the smallest hardware range that fully encompasses the range you choose.  So, suppose your device supports the following input ranges: ±0.2V, ±1V, ±5V, and ±10V, and you choose 0V - 3V as the range in the DAQ Assistant.  The driver will compare your requested range against the list of ranges your hardware supports and choose the smallest one that encompasses it.  That would be ±5V, because it is the smallest listed range that reaches 3V.  As a result, any input signal within ±5V will be returned, and anything outside that range will "rail" to either the maximum or minimum value.

 

We do this because using a smaller range makes more effective use of the resolution of the ADC.  Thus we try to use the most efficient range based on what you request, without picking a range that would cause you to miss data.
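The selection rule above can be sketched in a few lines of Python (a hypothetical model of the behavior, not actual NI-DAQmx driver code; the function name and range list are assumptions for illustration):

```python
def pick_input_range(req_min, req_max,
                     supported=((-0.2, 0.2), (-1.0, 1.0),
                                (-5.0, 5.0), (-10.0, 10.0))):
    """Return the smallest supported range that fully encompasses
    the requested [req_min, req_max] interval."""
    candidates = [(lo, hi) for lo, hi in supported
                  if lo <= req_min and hi >= req_max]
    # Smallest span wins: best ADC resolution without clipping the request.
    return min(candidates, key=lambda r: r[1] - r[0])

# Requesting 0 V - 3 V on a device with the ranges above selects +/-5 V:
print(pick_input_range(0.0, 3.0))  # (-5.0, 5.0)
```

Both ±5V and ±10V encompass 0V - 3V, but ±5V has the smaller span, so it wins.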

 

Let me know if I can clarify this further. 

Seth B.
Principal Test Engineer | National Instruments
Certified LabVIEW Architect
Certified TestStand Architect
Message 2 of 6

Fantastic explanation! Thank you very much.

Message 3 of 6

Thanks for this explanation, it has been really useful to me. Just a question: I have a voltage task with a custom scaling. Should the signal input range in the form be set in volts or in scaled units? For example: the accelerometer output signal is ±5V, and the scale is linear with a Y-intercept of 0 and a slope of 10. Should I set the signal input range to ±50?

 

thanks

 

Serbring

Message 4 of 6

This article should answer your question: http://digital.ni.com/public.nsf/allkb/BB29A9148CF2C6E48625766C006966A5

 

Regards,

Brice Sorrells

Systems Engineer

National Instruments

www.ni.com/support

Message 5 of 6

Thank you very much, that was very helpful!

Message 6 of 6