Signal Conditioning

Advantages/disadvantages of using millivolt rather than milliamp signals?

Hi

I have a choice of using a sensor that will give me either a millivolt or a milliamp signal. Which one should I use and why?

Thanks
Kris
Message 1 of 7
Kris,

It will depend on the distance/wiring between your sensor and the DAQ device. If the leads are short and it is for experimental use, voltage is OK.
Currents are used mostly in industrial, noisy environments or with longer cable lengths.
A standard DAQ device mostly can only read in voltages, so you need an extra shunt resistor. Sometimes (as in certain FieldPoint modules) this resistor is built in.
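For illustration, here is a quick Python sketch of the shunt conversion. The 4-20 mA range, the 250 ohm shunt, and the 0-100 degC span are just assumed, typical values, not anything specific to your sensor:

# Convert a 4-20 mA current-loop reading to engineering units via a shunt.
# All values below (250 ohm shunt, 0-100 degC span) are illustrative assumptions.
R_SHUNT = 250.0                   # ohms; 4-20 mA across 250 ohms gives 1-5 V
I_MIN, I_MAX = 0.004, 0.020       # amps (4-20 mA loop)
SPAN_MIN, SPAN_MAX = 0.0, 100.0   # e.g. degrees C

def volts_to_units(v_measured):
    # Map the voltage the DAQ sees across the shunt back to the sensed quantity.
    i = v_measured / R_SHUNT                 # Ohm's law: I = V / R
    frac = (i - I_MIN) / (I_MAX - I_MIN)     # 0.0 at 4 mA, 1.0 at 20 mA
    return SPAN_MIN + frac * (SPAN_MAX - SPAN_MIN)

print(volts_to_units(1.0))   # 4 mA  -> 0.0
print(volts_to_units(3.0))   # 12 mA -> 50.0
print(volts_to_units(5.0))   # 20 mA -> 100.0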

Patrick
Message 2 of 7
Hi Patrick

Thanks for your reply.
I found that with millivolt signals it is very easy to get ground loops. That is why I thought about acquiring milliamps instead. Do you know of any disadvantages of using milliamp signals?

Thanks again
Kris
Message 3 of 7
Kris,

I think the only one is the extra heat generated in the sensor to drive the output current. As long as the sensor is immune to the extra self-heating, it should not be a problem.

Patrick
Message 4 of 7
Hi Patrick

If I understand it correctly, the biggest problem with a millivolt signal is the voltage drop along the wires: if the wires are too long, the voltage drop can be quite large. Also, a millivolt signal is susceptible to noise and ground loops.

On the other hand, with a milliamp signal you don't have these problems, as the current does not decrease and is not affected by other high-voltage cables running alongside (as in an industrial environment). But the sensor itself can heat up in order to generate the required current.

Am I correct?

Thanks again
Kris
Message 5 of 7
Kris,
Voltage drop is not really a problem, but noise and ground loops are.
On the mA signals you are correct that the noise and ground problems are smaller; they cannot, however, be completely removed.
There is another alternative that uses a differential voltage output. The advantage is low power and good noise immunity; however, it is not commonly used in sensors, I guess because it takes an extra analog inverter and an extra wire.
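To see why a differential output helps with noise: the same pickup rides on both wires and cancels when you subtract the two inputs. A small Python sketch (the signal level and noise amplitude are made-up numbers):

import random

# Differential measurement: identical common-mode pickup on both wires
# cancels in the subtraction. All numbers are illustrative.
v_signal = 0.050                          # 50 mV sensor signal
for _ in range(3):
    v_noise = random.uniform(-0.5, 0.5)   # common-mode pickup on both wires
    v_plus  = +v_signal / 2 + v_noise
    v_minus = -v_signal / 2 + v_noise
    print(v_plus - v_minus)               # always 0.050; the noise cancels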

Patrick
Message 6 of 7
Voltage drop due to (excessive) resistance of the wires is not really a problem as long as you use a high-impedance measuring circuit. Voltage drop only occurs if there is a significant current flowing through the wires. Typically the input impedance of an ADC is much bigger than 1 MOhm, so wire resistances up to 10 kOhm will cause less than 1% error.
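A quick check of that figure (taking 1 MOhm as the ADC input impedance and 10 kOhm of wire resistance, per the numbers above):

# Loading error of the divider formed by wire resistance and ADC input impedance.
R_WIRE = 10e3   # ohms: 10 kOhm of wire resistance (worst case above)
R_IN   = 1e6    # ohms: 1 MOhm ADC input impedance
error = R_WIRE / (R_WIRE + R_IN)   # fraction of the signal dropped in the wires
print(f"{error:.2%}")              # 0.99% -> under 1%, as stated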

The problem lies in the necessarily high input impedance, which is prone to noise pickup.

A current signal can also be affected by noise. The ideal current-measuring circuit, however, has an input impedance of 0 ohms, and since the source impedance of noise signals is rather high, the noise is strongly attenuated ('broken down') by the low input impedance.
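The same divider arithmetic shows the attenuation. The 1 MOhm noise source impedance and the 10 ohm current input below are assumed, order-of-magnitude values:

# Noise divides between its (high) source impedance and the measuring input.
R_NOISE_SRC = 1e6   # ohms; capacitively coupled noise looks high-impedance
V_NOISE     = 1.0   # one volt of induced noise, for scale

for r_in in (1e6, 10.0):   # high-Z voltage input vs. near-zero-Z current input
    v_seen = V_NOISE * r_in / (R_NOISE_SRC + r_in)
    print(f"R_in = {r_in:>9.0f} ohm -> noise seen at input: {v_seen:.6f} V")
# A 1 MOhm input sees 0.5 V of the noise; a 10 ohm input sees only ~10 uV.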

It is not necessary (and not very advisable) to use a plain resistor as a current-to-voltage converter. The resistor will raise the input impedance higher than necessary and will force the source circuit to generate higher voltages. It is not difficult to build a simple current-to-voltage converter with an op-amp (a transimpedance amplifier).
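A sketch of the transimpedance idea (the 250 ohm feedback resistor and the 4-20 mA loop are again just example values): the op-amp holds its inverting input at virtual ground, so the source sees roughly 0 ohms, and ideally V_out = -I_in * R_f.

# Ideal transimpedance (current-to-voltage) stage: the inverting input sits at
# virtual ground, so the source sees ~0 ohms and V_out = -I_in * R_f.
R_F = 250.0   # ohms, feedback resistor (example value)

def transimpedance_out(i_in):
    return -i_in * R_F

for i_ma in (4, 12, 20):   # a 4-20 mA loop, for illustration
    print(i_ma, "mA ->", transimpedance_out(i_ma / 1000.0), "V")
# 4 mA -> -1.0 V, 12 mA -> -3.0 V, 20 mA -> -5.0 V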

It depends on the sensor's output circuitry whether it will generate heat or not. If it has a true voltage-to-current converter, very little heat will be generated.
Message 7 of 7