I am new to LabVIEW. I am trying my first project, which consists of measuring pressure from a transducer. The output of my transducer is 4-20 mA and I am using the USB-6229 card.
I am using a 250 ohm precision resistor so that the 4-20 mA current is converted into 1-5 V. The problem is that the voltage values I acquire oscillate a lot: for example, for an expected 1 V input (equivalent to zero pressure), I read in LabVIEW an oscillating signal between 0.95 V and 1.05 V. That represents an unacceptable variation in my sensor (about 20 psi). However, when I measure the voltage at the pins of the DAQ module using my multimeter, I read a steady signal of 0.995 V for zero pressure.
I appreciate any help with this.
Have you looked at this "variation"? Might it have a frequency of 60 Hz (in the US) or 50 Hz (most of the rest of the world)? Do you have a circuit diagram of how you are using the 250 ohm resistor? [Why are you using a "precision" resistor? Are you willing to trust its value? Were you planning to calibrate your setup?].
Am I correct that you are not an Electrical Engineering student? [Sorry about asking so many questions and not giving "the answer" -- it is far better for you to think about what you are doing and figure it out for yourself, maybe doing a little reading on the side if necessary ...].
Although this won't help you reduce variation, it might help you reduce surprise.
How fast does your multimeter update its display? What is the smallest division in the range you're measuring? Is it possible that it is already low-pass filtering the signal to give a steadier readout? When using a multimeter, an easy-to-read display is usually more valuable than a fast update rate (indeed, a fast update rate makes reading much harder!).
For a physical source of variation I'd look to an oscilloscope or Fourier Transform of the input to see if you can either visually or numerically pick out a dominant frequency (as Bob Schor said, it might be related to your power supplies).
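If you log a second or so of samples, you can look for that dominant frequency numerically. Here is a minimal sketch using NumPy's FFT; the sample rate and the synthetic test signal (1 V baseline with 50 mV of 60 Hz ripple) are just illustrative placeholders, not your actual data.

```python
import numpy as np

def dominant_frequency(samples, sample_rate_hz):
    """Return the strongest non-DC frequency component in the signal."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    spectrum[0] = 0.0  # ignore the DC bin (the ~1 V baseline)
    return freqs[np.argmax(spectrum)]

# Illustrative signal: 1 V baseline plus 50 mV of 60 Hz ripple at 1 kS/s
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
signal = 1.0 + 0.05 * np.sin(2 * np.pi * 60.0 * t)
print(dominant_frequency(signal, fs))
```

If the peak lands at 50 or 60 Hz, that points strongly to mains pickup, as Bob Schor suggested.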
Or, as my (formerly 3-year-old) daughter used to say, on submerging her ears in the tub and listening to herself talk, "Impedance mismatch!".
No, I am not an electrical engineer. I am a mechanical engineer; I have always built my setups using meters, but this time I wanted to try something new and challenge myself using LabVIEW. When I refer to a precision resistor, I mean a resistor with a tight tolerance; I think it is 0.1%.
I am planning to calibrate my setup using linear interpolation, using the voltage at zero pressure (theoretically 1 V) and the voltage at span pressure (theoretically 5 V). My plan was to calibrate the zero pressure and span pressure of my sensor using the actual voltage values. However, I am not getting a stable value, but an oscillating one.
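The two-point calibration described above can be sketched in a few lines. The function and variable names here are illustrative, and the measured values (0.995 V at zero, 4.98 V at span on a 0-100 psi sensor) are assumed for the example; in practice they would come from averaged readings.

```python
def make_pressure_scale(v_zero, v_span, p_zero, p_span):
    """Return a function mapping measured volts to pressure by linear interpolation."""
    slope = (p_span - p_zero) / (v_span - v_zero)
    return lambda volts: p_zero + slope * (volts - v_zero)

# Assumed example: 0-100 psi sensor, 0.995 V measured at zero, 4.98 V at span
to_psi = make_pressure_scale(0.995, 4.98, 0.0, 100.0)
print(to_psi(0.995))   # 0.0
print(to_psi(4.98))    # 100.0
```

The same arithmetic is easy to wire up in LabVIEW with a Scale block or a simple subtract/multiply pair; the point is just that the calibration only needs the two measured endpoint voltages.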
Are you looking at the data from a simulated device and not your actual USB device? Simulated devices output an oscillating signal.
Also consider a "running average": in general, sample at a fast rate and then take the average of a predetermined number of measurements.
Every power analyzer I have ever used could do this with user selectable "averaging depths" for exactly this reason.
Hi, I am facing the same problem right now that you were facing almost 4 years ago. I was wondering if you could tell me how you overcame it. I am also a mechanical engineer like you, and I am not too good at this stuff. Your help will be appreciated.