LabVIEW

filtering


Hi, All,

 

I have a data set (a 1D array of 500 values) and I am using LabVIEW's basic filter (smoothing with a moving average, rectangular window, half-width = 1) to smooth the noisy data. As you can see in the attached picture, the smoothed curve (red) looks shifted by 2 pixels with respect to the original data (white). If I take the difference between the original and the filtered data as a measure of the noise (green curve), the noise is exaggerated by this shift. One could write code to compensate for it, but I wonder whether there is a way to correct it in LabVIEW. Also, what exactly causes this shift?

 

The second picture shows the subVI that does the filtering.
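For reference, the lag is easy to reproduce outside LabVIEW. Here is a minimal Python sketch (the function name `moving_average` is my own, not a LabVIEW VI) of a causal rectangular moving average applied to a step signal, showing the smoothed edge trailing the original:

```python
def moving_average(data, width):
    """Causal rectangular moving average: each output sample is the
    mean of the current sample and the (width - 1) preceding ones."""
    out = []
    for i in range(len(data)):
        window = data[max(0, i - width + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

# A step edge at index 5.
signal = [0.0] * 5 + [1.0] * 5
smoothed = moving_average(signal, width=3)

# The smoothed edge only reaches 1.0 at index 7: the causal output
# lags a centered average by (width - 1) / 2 = 1 sample.
print(signal)
print(smoothed)
```

Whether the shift is 1 or 2 samples depends on the window width and on how the particular implementation aligns the window, but the mechanism is the same.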

 

Thanks!  

Message 1 of 3
Solution
Accepted by topic author femtovahan

Moving-average smoothing is itself a low-pass filter, and like any digital filter it has a phase response. Remember, phase in nature is delay in time. There are filters with zero phase delay, but a causal moving average is not one of them: a symmetric FIR of width N has linear phase, which corresponds to a constant group delay of (N − 1)/2 samples, and that delay shows up as the shift you see. When you configure the filter, select "Transfer function" in the view mode and you can see the phase response in the second graph. Notice that the phase delay is non-zero at most frequencies.
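As a quick check of that claim, the frequency response of a causal width-N moving average can be evaluated directly. A small Python sketch (function name `freq_response` is my own, assuming the standard FIR definition), showing that the phase divided by −ω is a constant delay of (N − 1)/2 samples:

```python
import cmath

def freq_response(width, omega):
    """Frequency response H(e^{jw}) of a causal width-point moving average,
    H(e^{jw}) = (1/width) * sum_{n=0}^{width-1} e^{-jwn}."""
    return sum(cmath.exp(-1j * omega * n) for n in range(width)) / width

# For a linear-phase FIR the phase is -w * (width - 1) / 2, i.e. a
# constant group delay of (width - 1) / 2 samples at all passband
# frequencies. For width = 3 that delay is 1 sample.
width = 3
for omega in (0.1, 0.2, 0.4):
    phase = cmath.phase(freq_response(width, omega))
    print(omega, phase / -omega)  # ~ (width - 1) / 2 = 1.0
```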

 

You can also increase the width of the filter and you should be able to watch the delay grow with it. So there is not much you can do other than realigning the signal after filtering, if you like.
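The realignment itself is just a shift by the group delay. A minimal Python sketch (the helper `realign` and its end-padding choice are my own, not a LabVIEW VI):

```python
def realign(filtered, width):
    """Advance the filtered signal by the filter's group delay,
    (width - 1) // 2 samples, padding the end with the last value
    so the array keeps its original length."""
    d = (width - 1) // 2
    return filtered[d:] + [filtered[-1]] * d

# A width-3 moving average of a step delays the edge by 1 sample;
# realigning shifts it back into place.
smoothed = [0.0, 0.0, 1 / 3, 2 / 3, 1.0, 1.0]
print(realign(smoothed, 3))
```

In LabVIEW the same effect can be had by rotating or indexing the filtered array before taking the difference with the original.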

Message 2 of 3
Thanks, I agree: it is related to the phase delay, which scales with the window size. Using LabVIEW's digital FIR (lowpass) filter seems to do a better job.
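Another option for offline data like this is forward-backward filtering, where the second pass cancels the first pass's delay so the net phase shift is zero (the same idea as SciPy's `filtfilt`). A minimal pure-Python sketch under that assumption (both function names are my own):

```python
def moving_average(data, width):
    """Causal rectangular moving average (shorter windows at the start)."""
    out = []
    for i in range(len(data)):
        window = data[max(0, i - width + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

def zero_phase_smooth(data, width):
    """Filter forward, then filter the reversed result and reverse back:
    the delays of the two passes cancel, leaving zero net phase shift."""
    fwd = moving_average(data, width)
    bwd = moving_average(fwd[::-1], width)
    return bwd[::-1]

# A single spike at index 3: after zero-phase smoothing the peak of the
# smoothed curve stays at index 3 instead of drifting to the right.
data = [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
result = zero_phase_smooth(data, 3)
print(result)
```

The cost is that the signal is filtered twice (a steeper effective response), and it only works on recorded data, not in a real-time stream.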
Message 3 of 3