
Moving Average of Valleys of Waveform Data

Solved!

My LabVIEW program collects waveform data in real time (in ddt form) and calculates a moving average in real time, using a Mean PtByPt function in a For Loop. The waveform data represents respiratory breathing, so the amplitude and period of the wave are constantly changing.

 

Problem: I would like to modify the program to calculate the moving average of the valleys. The end result I hope for is a smooth line with little to no sine wave left in it. Do you have suggestions for how to do this?

 

Some ideas I haven't tried are:

1. Use the LabVIEW peak detector function to find the amplitude of the wave. Subtract half the amplitude from the current moving average.

2. Offset the raw data by half the amplitude. Calculate the moving average of the offset data. (A rough sketch of ideas 1 and 2 follows this list.)

3. Is there any way to filter or transform the raw data to do this?  
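For what it's worth, here is a minimal Python/NumPy sketch of what ideas 1 and 2 might look like; in LabVIEW the peak detector and Mean PtByPt would play the roles of find_peaks and the moving average, and the function name and window parameter here are made up for illustration:

```python
import numpy as np
from scipy.signal import find_peaks

def valley_baseline(signal, window):
    """Shift a moving average down by half the estimated
    peak-to-valley amplitude (ideas 1 and 2 combined)."""
    kernel = np.ones(window) / window
    moving_avg = np.convolve(signal, kernel, mode="same")

    # Estimate the amplitude from detected peaks and valleys.
    peaks, _ = find_peaks(signal)
    valleys, _ = find_peaks(-signal)
    if len(peaks) == 0 or len(valleys) == 0:
        return moving_avg            # not enough structure yet

    amplitude = np.median(signal[peaks]) - np.median(signal[valleys])
    return moving_avg - amplitude / 2.0
```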

 

I have already tried using the LabVIEW peak detector function to find the valleys and then averaging the valleys. This did not give me a smooth curve, because occasional valleys (roughly one every 10 breaths) were higher than the rest. It would also occasionally miss a valley.
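One idea I haven't tried for those stray high valleys: average the recent valley values with a median instead of a mean, since a median ignores the occasional outlier. A minimal Python sketch, with made-up names:

```python
import numpy as np

def robust_valley_average(valley_values, window=5):
    """Median of the last `window` detected valley values; unlike a
    mean, the median is unaffected by one stray high valley."""
    return float(np.median(valley_values[-window:]))
```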

 

Attached images:

1) Code that reads the data in and builds the graph. The data read in is ddt.

2) Code for calculating the moving average. This is in a separate loop from the main loop that collects the data. The moving average is also filtered, but please disregard that part.

3) An example of what the data in the graph looks like. Please ignore the green line; that is a filter I was playing with.

Message 1 of 11

 

You need to define "to calculate the moving average of the valleys" first.

 

What exactly is a valley? Everything under some threshold? It's not trivial...

Message 2 of 11

Thanks for the reply, wiebe@CARYA:

I'll define a valley as the region between where the main sinusoidal wave decreases and increases (i.e., between where the slope of the graph is negative and where the slope is positive). There is one valley for every cycle/period of the sinusoidal wave. Based on the data I have so far, I would define my threshold as the bottom 10% of each cycle/period of the sinusoidal wave. A minimal sketch of that definition is below.
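To make that concrete, here is a minimal Python sketch of the bottom-10% definition, assuming one already-segmented cycle of samples (segmenting the cycles is, of course, the open question):

```python
import numpy as np

def valley_region(cycle):
    """Samples in the bottom 10% of one cycle's range."""
    lo, hi = float(cycle.min()), float(cycle.max())
    threshold = lo + 0.10 * (hi - lo)
    return cycle[cycle <= threshold]

# The valley value for a cycle could then be, e.g.,
# valley_region(cycle).mean()
```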

 

In the attached picture, I've circled in red two examples of valleys. The one on the left is a sharp point, and the one on the right is longer, with some higher-frequency noise. I expect to see both forms in my data.

Message 3 of 11

@al_p wrote:

I'll define a valley as the region between where the main sinusoidal wave decreases and increases (i.e., between where the slope of the graph is negative and where the slope is positive). There is one valley for every cycle/period of the sinusoidal wave. Based on the data I have so far, I would define my threshold as the bottom 10% of each cycle/period of the sinusoidal wave.

In the attached picture, I've circled in red two examples of valleys. The one on the left is a sharp point, and the one on the right is longer, with some higher-frequency noise. I expect to see both forms in my data.


That makes clear what you want, but it's not enough for a computer. A scope trigger, for instance, will also fire on noise. There will probably be small descending slopes in the rising parts of the wave, and an algorithm will trigger on those.

 

A "threshold as the bottom 10% of each cycle/period" only shifts the problem to detecting cycles\periods. 10% is only a real value once the bottom is known.

 

Do you want to analyze during acquisition, i.e., in real time on streaming data, continuously? Or do you have the relative luxury of analyzing after sampling all the data? That would make life a lot easier...

Message 4 of 11

For this application, we don't have the luxury of analyzing after sampling all data. I would like to analyze in "real time" and have the resulting data streaming on a graph.  

Message 5 of 11

What about using Histogram PtByPt to select the bottom bin of the magnitude?

 

You could set up the bins to give you a number to read directly, or use the histogram to identify how many samples to average from the minimum side of a sorted array of data points. A rough sketch of the idea is below.
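A rough text-language analogue of the idea in Python/NumPy; in LabVIEW the current window of data would feed Histogram PtByPt instead, and the bin count here is illustrative:

```python
import numpy as np

def bottom_bin_average(window, n_bins=10):
    """Average of the samples that land in the lowest histogram
    bin of the current window of data."""
    counts, edges = np.histogram(window, bins=n_bins)
    in_bottom = (window >= edges[0]) & (window <= edges[1])
    return float(window[in_bottom].mean())
```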

Message 6 of 11

Thank you, WhtHawk, that is a good idea. I have not tried it yet. 

 

I have got something very basic working (see attached code). It just collects an array of flow and time data every 4 seconds and finds the minimum using the LabVIEW Array Max & Min function (see attached picture). I am not sure why it is graphing a random line at the beginning of the run, but the line across the minimum valleys is very close to what I was trying to achieve. An improvement would be making the red line smoother.

 

I think the program is lagging a little, and I am going to try to eliminate the Build Array part and just scan the data directly for the minimum.

 

If you know why I am getting a random line in the beginning, or have any other suggestions, I would appreciate it. 

Message 7 of 11

The short answer: that's not how you should program in LabVIEW.**

 

The technical answer: your 'random line' is caused by the "Averaging While Loop" starting to run before the top loop takes a measurement. There is no reason for two loops. As points are added, the minimum can be evaluated. If the samples are equally spaced in time, use Array Max & Min PtByPt and set the number of samples to a 4-second equivalent.
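For reference, here is a minimal Python analogue of such a point-by-point moving minimum; with equally spaced samples, n would be the sample rate times 4 seconds (the class and names are illustrative, not LabVIEW's):

```python
from collections import deque

class MovingMin:
    """Minimum over the last n samples, updated point by point.
    The monotonic deque makes each update O(1) amortized, so no
    sorting or array rebuilding is needed."""
    def __init__(self, n):
        self.n = n
        self.buf = deque()   # (index, value) pairs, values increasing
        self.i = 0

    def add(self, x):
        # Samples larger than x can never be the window minimum again.
        while self.buf and self.buf[-1][1] >= x:
            self.buf.pop()
        self.buf.append((self.i, x))
        # Drop the front entry once it falls out of the n-sample window.
        if self.buf[0][0] <= self.i - self.n:
            self.buf.popleft()
        self.i += 1
        return self.buf[0][1]    # current minimum of the last n samples
```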

 

**

Straighten all wires, remove all dead space, start using subVIs, make wires actually come out of terminals, put nodes in left-to-right order, and so on. You got away with it (up to now) because your program is small.

The more your program grows, the more problems you will get. Just a friendly warning.
Message 8 of 11

Okay, I tried to clean up my program based on your feedback, wiebe@CARYA. Now everything is in one loop, and some of the bigger pieces of code are put into subVIs.

 

I had to change the algorithm for finding the minimums to get it to work with the timing of the main While Loop. Now it checks for the minimum on every iteration of the While Loop, and after 20 iterations it outputs the minimum of those last 20 data points onto the graph.

 

I'm still having the issue with the 'random line' being graphed at the beginning. I don't have the issue the first time I open the program and run it. I suspect the 'random line' has something to do with clearing out old data / zeroing variables correctly. Or could it still be that the code for finding the minimum runs before data is collected? Do you have any specific suggestions for what to change or fix?

 

Thanks for all the help, I really appreciate it. 

Message 9 of 11
Solution
Accepted by topic author al_p

Okay, I fixed the random line issue: it was as simple as zeroing the variable at the beginning. Here is a version with that fix. I also temporarily eliminated the part of the code that saves data to a file; it was causing a major lag in my data acquisition system.

Message 10 of 11