04-12-2018 06:48 AM
Hi,
I'm fairly new to LabVIEW and am struggling to implement a self-coded lowpass filter in my code. It works fine for simulated sine waves (as can be seen below). However, when I attempt to lowpass load cell data acquired from a DAQ Assistant, something strange happens: the baseline of the filtered signal is offset from the unfiltered signal, and the offset increases as the cut-off frequency decreases. I've attached the VI for the lowpass filter. Any help or insight would be greatly appreciated!
Solved!
04-13-2018 04:16 AM - edited 04-13-2018 04:18 AM
Hello mikarn91,
Welcome to the forums! I made a test VI to wrap around your VI to make it easier to exercise. I've attached it, saved in LabVIEW 2016, so you should be able to open it. (You probably have a similar test file of your own; feel free to attach it as well, since that lowers the hurdle for others to start looking into your issue.)
I think I found your mistake: after removing the Negate in your filter, it works much better for me.
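For anyone hitting the same symptom: we can't show the block diagram here, but a plausible text analog (a sketch only, assuming the VI implements a first-order exponential lowpass; the function names and values below are hypothetical, not from the attached VI) shows why a stray Negate produces exactly this behavior. A zero-mean sine through the buggy filter just comes out inverted, which is easy to mistake for a phase shift, but any signal with a DC baseline, like load cell data, settles at the wrong level:

```python
def lowpass(samples, alpha):
    """Correct first-order lowpass: y[n] = y[n-1] + alpha*(x[n] - y[n-1])."""
    y, out = 0.0, []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

def lowpass_negated(samples, alpha):
    """Buggy version with an extra Negate on the input: tracks -x instead of x."""
    y, out = 0.0, []
    for x in samples:
        y += alpha * (-x - y)   # sign flip: steady state is -mean(x)
        out.append(y)
    return out

# A constant 5 V baseline (stand-in for a loaded load cell):
signal = [5.0] * 1000
good = lowpass(signal, 0.05)
bad = lowpass_negated(signal, 0.05)
print(round(good[-1], 3))   # ≈ 5.0, tracks the baseline
print(round(bad[-1], 3))    # ≈ -5.0, baseline flipped, i.e. a large offset
```

With a slower filter (lower alpha, i.e. lower cut-off), the buggy output also takes longer to settle, so over a finite acquisition the apparent offset can look cut-off dependent, which matches the symptom described above.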
04-14-2018 04:25 AM
Wow, talk about being blind to my own code. This fixed everything. Thanks!