LabVIEW

eliminating noise from analog input signals

I am using a Mean function to reduce high-frequency noise from the analog input signals. I am required to sample at the maximum rate of the DAQ PCI-6052E, which is 333k samples/sec, for a long period of time, which requires a continuous-acquisition setup. Please see the attached program.

The Mean function works very well to reduce noise for low-frequency signals such as 0.0154 Hz. However, for higher-frequency signals the number of points to be averaged is much smaller than the sampling rate, and because AI Read.vi then pulls only that small number of points from the buffer per call, the buffer runs out of space very quickly while the hardware keeps acquiring at the maximum rate.
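
Roughly what my loop does, sketched here in Python/NumPy only because I cannot paste the LabVIEW diagram as text (the loop period and the number of points averaged below are placeholders, not the values in the attached VI):

import numpy as np

fs = 333_000                  # hardware acquisition rate, samples/sec
n_to_average = 100            # points AI Read.vi returns and averages per iteration
loop_period = 0.1             # assumed time per software loop iteration, sec

# Samples the board writes into the buffer per iteration vs. samples the loop consumes:
produced = fs * loop_period   # 33,300 samples land in the buffer
consumed = n_to_average       # only 100 of them are read and averaged
print('backlog per iteration:', produced - consumed)   # keeps growing until overflow

# The averaging itself (what the Mean function does to each small block):
noisy_block = np.random.normal(0.0, 0.1, n_to_average) + 1.0
print('averaged point:', noisy_block.mean())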

Can anyone suggest another method to reduce noise in real-time acquisition mode over a long period of time? Would a filter do the trick?


Gaston
Message 1 of 2
Hello;

When averaging doesn't do the trick, the best bet is to insert a filter in the system.
I'm also attaching an Application Note that talks about noise, so maybe you can extract some valuable information from there to help you out.
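
To show the idea in text (Python/SciPy here as a stand-in for LabVIEW's filter VIs; the cutoff frequency and filter order below are only assumptions you would tune to your signals), the approach is to read everything available from the buffer on each iteration and low-pass filter the whole chunk, carrying the filter state between reads so the buffer is always drained:

import numpy as np
from scipy.signal import butter, sosfilt, sosfilt_zi

fs = 333_000                 # PCI-6052E acquisition rate, samples/sec
cutoff = 1_000               # assumed pass-band edge, Hz -- set it just above
                             # the highest signal frequency you care about
sos = butter(4, cutoff, btype='low', fs=fs, output='sos')
zi = sosfilt_zi(sos)         # filter state carried across buffer reads

def process_chunk(chunk, zi):
    # Filter an entire chunk read from the acquisition buffer.  Because the
    # whole chunk is consumed every iteration, the buffer cannot back up the
    # way it does when only a fixed small number of points is averaged.
    filtered, zi = sosfilt(sos, chunk, zi=zi)
    return filtered, zi

# Simulated chunk: 0.1 s of a 50 Hz signal plus noise
t = np.arange(int(0.1 * fs)) / fs
chunk = np.sin(2 * np.pi * 50 * t) + np.random.normal(0.0, 0.2, t.size)
filtered, zi = process_chunk(chunk, zi)

Whether you filter like this or keep averaging, the important thing is that every loop iteration consumes samples at least as fast as the board produces them.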
Hope this helps.
Filipe
Message 2 of 2