10-05-2010 03:26 AM - edited 10-05-2010 03:28 AM
Hi guys,
I am new to signal filtering in LabVIEW, so I have what I believe is a basic question. I sample analogue data (AI) every 25 ms and run a PID directly on it. I have noticed that because my data is relatively noisy, my PID controller has trouble coping with it. So I thought I should apply a digital filter to reduce the noise before the PID. I have previously used the Butterworth low-pass filter in MATLAB with success, so I thought I would stick with it. The question is that, as far as I understand, the filter needs a few data points in order to be applied, which means I would have to pass chunks of data to it.
Since my sampling is at 25 ms, I presume that filtering every 250 ms (i.e. 10 data points) should be OK...? Does this then automatically mean that my PID should also run every 250 ms, right after the filter? Is this right?
Is there any other way to get around this and filter the data at the higher rate? (E.g. appending the latest 25 ms sample to the end of a, say, 10-row array and then filtering every 25 ms?)
Hope I made my problem clear; basically I am looking for some filtering strategy advice here.
Thanks for your time,
Harry
10-05-2010 03:52 AM
Take a look at the point-by-point filter:
Owning Palette: Filters PtByPt VIs
Requires: Full Development System
Generates a digital Butterworth filter by calling the Butterworth Coefficients VI.
This VI is similar to the Butterworth Filter VI.
On the first run, the filter coefficients are calculated and the filter buffer (sized according to the filter order) is initialized to zero.
You feed in your value every 25 ms and get the filtered value out. You can open this VI and look inside to see how it runs.
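To illustrate the idea in text form (the actual VI is a LabVIEW block diagram), here is a rough Python sketch of what a point-by-point Butterworth filter does: coefficients are computed once, a small internal buffer holds the last few samples, and each new 25 ms sample goes in while one filtered sample comes out. This is a second-order low-pass biquad with Butterworth damping (coefficients from the standard bilinear-transform / RBJ cookbook form); the 5 Hz cutoff is an assumed example value, not something from the thread.

```python
import math

class ButterworthLowPassPtByPt:
    """Second-order low-pass biquad (Butterworth when Q = 1/sqrt(2)),
    applied one sample at a time, like the Butterworth Filter PtByPt VI."""

    def __init__(self, cutoff_hz, sample_rate_hz):
        q = 1.0 / math.sqrt(2.0)                      # Butterworth damping
        w0 = 2.0 * math.pi * cutoff_hz / sample_rate_hz
        alpha = math.sin(w0) / (2.0 * q)
        cosw0 = math.cos(w0)
        a0 = 1.0 + alpha
        # Normalised feed-forward (b) and feedback (a) coefficients,
        # computed once on the "first run"
        self.b = [(1.0 - cosw0) / 2.0 / a0,
                  (1.0 - cosw0) / a0,
                  (1.0 - cosw0) / 2.0 / a0]
        self.a = [-2.0 * cosw0 / a0, (1.0 - alpha) / a0]
        # Filter buffer (past inputs/outputs), initialized to zero
        self.x1 = self.x2 = self.y1 = self.y2 = 0.0

    def process(self, x):
        """Feed one raw sample in, get one filtered sample out."""
        y = (self.b[0] * x + self.b[1] * self.x1 + self.b[2] * self.x2
             - self.a[0] * self.y1 - self.a[1] * self.y2)
        self.x2, self.x1 = self.x1, x                 # shift input history
        self.y2, self.y1 = self.y1, y                 # shift output history
        return y

# 25 ms sampling -> 40 Hz sample rate; 5 Hz cutoff is an assumed example
f = ButterworthLowPassPtByPt(cutoff_hz=5.0, sample_rate_hz=40.0)
# Step input: the filtered output settles smoothly toward 1.0,
# exactly one output value per input sample (so the PID can stay at 25 ms)
filtered = [f.process(v) for v in [1.0] * 50]
```

The key point for the original question: because the history lives inside the filter, there is no need to batch 10 samples into an array — the PID can keep running every 25 ms on each filtered value.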