
Filtered data time delay compensation

Solved!

Dear experts, 

 

Hi, I'm DMT.

 

I want to get velocity data and use it after filtering (Butterworth, bandpass), but there is one problem.

After filtering, there is a time delay compared to the original data.

 

MATLAB has a function to compensate for this delay. Is it possible in LabVIEW?

 

I have attached a graph showing the delayed data.

 

I would appreciate your reply.

Thank you.

[Attachment: aaa.PNG]

Message 1 of 7

Hi DMT,

 

My first thought was that if the delay is constant, you can shift it fairly easily.

However, my second thought was: "Is that a graph or a chart?"

 

If the cause of the "delay" is that you're acquiring data (the black input), calculating a value (the blue line?) and then filtering, and in a later iteration of the loop you get the results and plot them too (the red line), then you'll want to store the "time" values used as an input and plot a graph rather than a chart.

 

In case you didn't know (sorry if this is obvious to you), a chart stores history for you, and doesn't have a concept of an "X" value really - it uses an array of equally-spaced data points, and often a waveform is a good model for this. A typical input might be a scalar value (e.g. DBL), if you only had one line.

 

A graph stores X-Y points (there are a few possible variations, e.g. cluster of two values, cluster of two arrays of values, but that's the basic idea). Each Y value is paired with an X value which determines the location of the point directly.

On a graph, you always write all of the data to plot at once, so you usually store history in a Shift Register or similar.

This freedom to control your X values specifically would allow you to align your data correctly, if you store the original "time" values with the data as it is processed.
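As a rough illustration of the same idea in text form (Python rather than LabVIEW, with made-up data), keeping the time stamps with the samples lets you plot (t, y) pairs so the processed trace lands at the correct X positions:

import numpy as np
import matplotlib.pyplot as plt

dt = 1.0 / 1024.0                   # hypothetical sampling interval
t = np.arange(2048) * dt            # the "time" values, stored alongside the data
raw = np.sin(2 * np.pi * 5.0 * t)   # stand-in for the acquired signal

# Imagine 'processed' only becomes available a few loop iterations later.
processed = raw.copy()

# Chart-style plotting would just append Y values at equal spacing, so
# late-arriving data appears shifted. Graph-style plotting pairs each Y
# with its stored X, so the traces line up regardless of when they were computed.
plt.plot(t, raw, "k", label="raw")
plt.plot(t, processed, "r", label="processed (plotted against the stored t)")
plt.legend()
plt.show()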

 

Can you share your code so we can see if this is your problem?


Message 2 of 7

Hi cbutcher,

Thank you for your answer.

 

I've attached the code and raw data (acc1, tacho2); you can load the two files and run the code.

But what I want to know is simply how to compensate for the time delay caused by the filter.

Time delay is a natural phenomenon when filtering, and as I said in the first post, some programs (MATLAB, etc.) have a function to compensate for it. Please refer to the link below for details.

https://kr.mathworks.com/help/signal/ug/compensate-for-the-delay-introduced-by-an-iir-filter.html 

 

If the time delay were constant it would be easy, but depending on the filter frequency settings the time delay increases or decreases. I would like to know how to correct this automatically.
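Not a LabVIEW answer, but as a sketch of the general idea in Python (SciPy), with an assumed filter order and cutoffs: estimate the filter's group delay in the passband and shift the filtered trace back by that many samples.

import numpy as np
from scipy import signal

fs = 1024.0                                     # assumed sampling frequency
low, high = 0.7, 30.0                           # assumed bandpass cutoffs, Hz
b, a = signal.butter(2, [low, high], btype="bandpass", fs=fs)

# Group delay of the IIR filter, in samples, as a function of frequency.
w, gd = signal.group_delay((b, a), fs=fs)
in_band = (w >= low) & (w <= high)
delay = int(round(np.mean(gd[in_band])))        # rough in-band average

t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 5.0 * t)                 # stand-in signal
y = signal.lfilter(b, a, x)

# Shift the filtered data back by the estimated delay to line it up with x.
# An IIR filter's group delay is not constant across frequency, so this is
# only approximate, and 'delay' changes whenever the cutoffs change.
y_aligned = y[delay:]
t_aligned = t[: t.size - delay]

The same estimate-and-shift approach can be built in LabVIEW, but the delay has to be recomputed whenever the filter settings change.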

 

 

Message 3 of 7
Solution
Accepted by topic author DNS_MTB

So I started cleaning up some minor sections of your code, but then I realised that the output is probably closer to what you're looking for if you simply set the filter's sampling frequency input to the reciprocal of the sampling time.

[Attachment: cbutcher_0-1600231884821.png]

 

I see that the sampling time (in the data file you attached - thank you for making it easy to test your VI!) is 0.0009765625, which gives a sampling frequency of 1024 (I guess this is set by your acquisition module). So you could just wire 1024 (instead of 3200) if this is constant, or use the Reciprocal node if it might change.

 

You can probably also get the sampling time directly from your data if you want to avoid "magic numbers" on your block diagram.
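In text form the idea is just this (the file name and column layout here are assumed, purely for illustration):

import numpy as np

# Assumed layout: first column is time, second is the measured value.
data = np.loadtxt("acc1.txt")        # hypothetical file name
dt = data[1, 0] - data[0, 0]         # sampling time, e.g. 0.0009765625 s
fs = 1.0 / dt                        # -> 1024.0; this is the value to wire to
                                     #    the filter instead of a hard-coded 3200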


Message 4 of 7

Also, if you have access to this function, your "integration" loop at the top can be replaced by the Integral x(t) VI. This requires the "Full" version of LabVIEW (it isn't included in Base; Professional has it too).

 

I wired only the X and dt values and got a fairly similar result (it looks a little 'tidier' - I'm not sure exactly why but I guess you could dig into this if you wanted...)

[Attachment: cbutcher_0-1600232423597.png]
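For anyone reading along without that VI, the underlying idea is a cumulative numerical integration of acceleration into velocity; a rough Python sketch on made-up data (not necessarily the exact rule Integral x(t) uses):

import numpy as np
from scipy import integrate

dt = 1.0 / 1024.0                    # assumed sampling time
t = np.arange(4096) * dt
acc = np.sin(2 * np.pi * 5.0 * t)    # stand-in for the acceleration data

# Cumulative trapezoidal integration: acceleration -> velocity.
vel = integrate.cumulative_trapezoid(acc, dx=dt, initial=0.0)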

 


Message 5 of 7

Due to circumstances, I will continue asking from a new account. This post is linked to the thread below.

https://forums.ni.com/t5/LabVIEW/Filtered-data-time-delay-compensation/td-p/4083499?profile.language...

 

[Attachment: aaaa.PNG]

 

I tried changing the sampling frequency as cbutcher said, but if I change the cutoff frequencies (30, 0.7, etc.) there is a delay again. Depending on the situation I need to change the cutoff frequencies, so I want to compensate for the delay at whatever frequencies are set.

I have a few questions.
1. Is it correct to wire 1/(sampling time) to the sampling frequency input?
2. Is it possible to automatically compensate for the delay even if the filter cutoff frequencies are changed?
3. In the future I will need to do this processing on blocks of N samples received in real time. Is it possible to integrate N-sample blocks with the Integral x(t) VI? (The raw data I uploaded is the data from one loop.)

That is all. I would appreciate your reply.
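Regarding question 2, one cutoff-independent option (and, if I remember the linked MathWorks page correctly, what it suggests for IIR filters) is zero-phase forward-backward filtering: it removes the phase delay for any cutoff setting, but it needs the whole record, so it cannot run sample-by-sample in real time. A rough Python sketch of the idea, with assumed parameters:

import numpy as np
from scipy import signal

fs = 1024.0                                     # assumed sampling frequency
b, a = signal.butter(2, [0.7, 30.0], btype="bandpass", fs=fs)

t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 5.0 * t)                 # stand-in signal

# Filtering forward and then backward cancels the phase response, so the
# output stays aligned with the input regardless of the cutoff frequencies.
# The trade-off: it is non-causal and must run on a complete block of data.
y_zero_phase = signal.filtfilt(b, a, x)

This addresses the delay for whole records; processing in N-sample blocks would still need a causal approach or careful handling of block edges.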

Message 6 of 7
Solution
Accepted by topic author DNS_MTB

Dear experts, 

 

Hi, I'm DMT.

 

I'm trying to filter a signal (Butterworth, bandpass), but there is a problem with phase delay after the filter.

So I added the following code to correct this delay.

[Attachment: aa.PNG]

 

 

By the way, after adding the code, the phase was corrected, but there was a slight difference in the value of the signal.

 

[Attachment: aaa.PNG]

 

When the high and low cutoff frequencies are changed, the value difference also changes.

 

Is there any problem with the phase delay compensation code? If there is, please let me know how to fix it.

I will attach the code for reference.

 

Thank you very much.

 

 

Message 7 of 7