LabVIEW


Help with mean filtering and noise reduction with MathScript

Hello, I have this:

 

The idea of mean filtering is simply to replace each value in a signal with the mean (average) value of its neighbors. A mean filter is widely used for noise reduction.

Start by adding some random noise to a signal (use the file echo_1.wav or any other speech data file). Then, use mean filtering to reduce the introduced noise. More specifically, take the following steps:

  1. Normalize the signal values in the range [0 1].
  2. Add random noise to the signal by using the function randn. Set the noise level as a control.
  3. Convolve the noise-added signal with a mean filter. This filter can be designed by taking an odd number of ones and dividing by the size. For example, a 1×3 mean filter is given by [1/3 1/3 1/3] and a 1×5 mean filter by [1/5 1/5 1/5 1/5 1/5]. Set the size of the mean filter as an odd-number control (3, 5, or 7, for example).
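The three steps above can be sketched in plain text code. This is a minimal pure-Python illustration (the assignment targets MathScript, so treat the function names here as hypothetical stand-ins; `random.gauss` plays the role of MathScript's `randn`):

```python
import random

def normalize(signal):
    """Step 1: scale samples linearly into the range [0, 1]."""
    lo, hi = min(signal), max(signal)
    return [(s - lo) / (hi - lo) for s in signal]

def add_noise(signal, level):
    """Step 2: add zero-mean Gaussian noise; 'level' is the noise-level control."""
    return [s + level * random.gauss(0.0, 1.0) for s in signal]

def mean_filter(signal, size):
    """Step 3: convolve with a 1xN mean kernel of odd size ('valid' region only)."""
    kernel = [1.0 / size] * size
    out = []
    for i in range(len(signal) - size + 1):
        out.append(sum(signal[i + j] * kernel[j] for j in range(size)))
    return out
```

With `size = 3` the kernel is exactly the `[1/3 1/3 1/3]` filter from the text; increasing the size smooths more aggressively at the cost of blurring the speech signal.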
 
0 Kudos
Message 1 of 3
(3,102 Views)

None of your points requires MathScript. In fact, it would be easy to implement them in plain LabVIEW. This is the LabVIEW forum.

 

It seems you want to simulate filtering. Normalizing the signal between 0 and 1 is not required for filtering (it works equally well for any amplitude or offset), and adding noise to a real signal is not a typical way to remove noise. 😄 It seems the first two steps are there to simulate a noisy signal. All the filtering is done with the convolution: just generate the convolution kernel and use the convolution tools from the signal processing palette to do the rest in basically one step. No text code needed.
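To see why this is "basically one step": once the kernel exists, the entire mean filter is a single convolution call. A pure-Python sketch of what the Convolution VI from the signal processing palette computes (the variable names are illustrative):

```python
def convolve(x, h):
    """Full linear convolution of signal x with kernel h, length len(x)+len(h)-1."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i in range(len(x)):
        for j in range(len(h)):
            y[i + j] += x[i] * h[j]
    return y

size = 5                        # odd mean-filter size, wired from a front-panel control
kernel = [1.0 / size] * size    # this is the entire "filter design"
# filtered = convolve(noisy_signal, kernel)
```

In LabVIEW itself you would wire the noisy signal and the kernel array straight into the Convolution VI; no loops are needed.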

 

(If you are really looking for a MathScript solution, have a moderator move your question to the MathScript forum. I would recommend against using MathScript, though.)

0 Kudos
Message 2 of 3
(3,082 Views)