Hi all,
I have a question about high-pass filtering in LabVIEW. My understanding is based purely on observation, as I've never been taught about filters directly.
After high-pass filtering a signal with, for example, a 2nd-order IIR Butterworth filter, there is a transition near time = 0 from a 'false' high magnitude down to the expected level, as in the attached plot (albeit generated with a Fortran program, as I don't have anything else to hand at the moment). To reduce its significance relative to the rest of the filtered signal, you can increase the sampling time, change the filter settings, or even record additional samples with the intention of truncating them afterwards.
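For anyone who wants to reproduce what I'm describing outside LabVIEW, here is a minimal sketch in Python/SciPy (my assumption of tooling — the same idea applies to any IIR implementation). It shows the startup transient from zero initial filter state, plus two common mitigations: initialising the filter's internal state to its steady-state value for the first sample, and zero-phase forward-backward filtering, which pads the signal internally. The signal, cutoff, and sample rate are made up for illustration.

```python
import numpy as np
from scipy import signal

fs = 1000.0                          # sample rate, Hz (illustrative)
t = np.arange(0, 1.0, 1.0 / fs)
# 'laminar-like' signal: DC offset plus a slow 2 Hz oscillation,
# i.e. no genuine high-frequency content
x = 1.0 + 0.1 * np.sin(2 * np.pi * 2 * t)

# 2nd-order Butterworth high-pass, 50 Hz cutoff
b, a = signal.butter(2, 50.0 / (fs / 2), btype="high")

# Naive filtering: zero initial state -> large 'false' magnitude near t = 0
y_naive = signal.lfilter(b, a, x)

# Mitigation 1: start the filter in steady state for the first sample,
# so a constant input produces the correct (zero) high-pass output from n = 0
zi = signal.lfilter_zi(b, a) * x[0]
y_init, _ = signal.lfilter(b, a, x, zi=zi)

# Mitigation 2: zero-phase forward-backward filtering; filtfilt pads the
# ends of the signal internally to suppress edge transients
y_zerophase = signal.filtfilt(b, a, x)
```

With the naive call, the first output sample is roughly b[0] * x[0] (large), whereas both mitigations keep the output near zero at t = 0, as expected for a laminar signal with no high-frequency content.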
Is there an established method of getting around this? I ask because I'm trying to accurately discriminate between the laminar and turbulent portions of a velocity signal, but near time = 0 our method returns false positives for turbulent structures even in a completely laminar signal. I presume this will always be the case, but it's better to ask first and conclude afterwards!
Thanks.
---------------------------------------------------
LabVIEW 8.5 User.