I'm using an NI 9205, wired differentially, to measure pressure transducers that output a 0 to 100 mV signal. My FPGA VI passes these values to the host VI in volts (I can't find a way to have the FPGA VI output in mV). The code then multiplies the voltage by a scale factor to convert it to a usable pressure value. I'm seeing noise on the card, and when the values are multiplied, the noise is amplified along with the signal. Is there a LabVIEW (Signal Processing) function that will reduce or eliminate the noise, or one that will scale the signal without amplifying the noise as well?
This is really strange. The 9205 can be set to accept various input-voltage ranges (such as ±1 V) and should show noise on the order of tens of microvolts at that range. You didn't include any code (please, no pictures of code — attach the VIs), so we can't comment on what you did. What chassis are you using? [I'm assuming it's a CompactRIO, since you mention FPGA.] With differential A/D and proper grounding, you shouldn't see much noise. Have you looked at your signal with an oscilloscope? Do you know any electrical engineers (or electrophysiologists)?
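One point worth making explicit: multiplying a signal by a constant scales the noise by that same constant — linearly, not exponentially — and a simple moving average (which LabVIEW's filtering VIs can do on the host side) knocks white-noise RMS down by roughly the square root of the window length. A quick sketch in Python rather than LabVIEW, with a made-up 1000× volts-to-pressure scale factor and 20 µV RMS simulated noise, just to illustrate:

```python
import random
import statistics

random.seed(0)

# Simulated transducer reading: 50 mV DC plus ~20 uV RMS white noise
signal_v = [0.050 + random.gauss(0.0, 20e-6) for _ in range(10_000)]

# Hypothetical scale factor converting volts to pressure units
SCALE = 1000.0
pressure = [v * SCALE for v in signal_v]

# Multiplying by a constant scales the noise RMS by exactly that constant
print(statistics.stdev(pressure) / statistics.stdev(signal_v))  # ~1000

# An N-point moving average cuts white-noise RMS by roughly sqrt(N)
N = 100
smoothed = [sum(pressure[i:i + N]) / N for i in range(len(pressure) - N + 1)]
print(statistics.stdev(smoothed))  # roughly 10x smaller than stdev(pressure)
```

So filtering before (or after — it makes no difference for a pure gain) the multiplication is fine; the multiplication itself isn't creating extra noise, it's just magnifying what's already on the input.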