LabVIEW


One data point every 10 seconds. How can I smooth the data in "real time"?

Solved!

Hello,

I have a program that returns one data point every 10 seconds (particle size data). As the data points scatter quite a bit, I would like to perform some smoothing on the data. I used a pre-recorded data set to determine which smoothing algorithm works best (I used OriginPro 9 for that). I get the best results with a Savitzky-Golay filter.

Is it possible to implement such a filter so that it takes the last couple of values into account and returns the smoothed value?

And to do that repeatedly until the process is finished? I am afraid that this is not possible. Yet, I am not very experienced with LabVIEW and hope I am wrong 🙂

It would be great if one of you could help me.

 

Message 1 of 15
(4,214 Views)

Use a shift register holding a preallocated array, and use Replace Array Subset to put each new value into the array.

Since you most likely do not need a ring buffer (assumption: the index position is important to you!), you can rotate the array and replace the most recent value at index N-1.
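
A minimal Python sketch of that rotate-and-replace scheme (the real implementation would use a LabVIEW shift register; the buffer length and names here are illustrative):

```python
import numpy as np

N = 15  # buffer length -- illustrative; pick your smoothing window size

buf = np.zeros(N)  # preallocated once, like the array carried in a shift register

def push(buf, new_value):
    """Rotate the buffer by one and overwrite the newest slot at index N-1."""
    buf = np.roll(buf, -1)   # everything shifts one place toward index 0...
    buf[-1] = new_value      # ...and the freed last slot takes the new reading
    return buf
```

Each 10-second reading then becomes `buf = push(buf, reading)`, and the whole buffer is handed to the smoothing routine.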

 

 

Norbert
----------------------------------------------------------------------------------------------------
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.
Message 2 of 15
(4,198 Views)
Solution
Accepted by topic author Qbach

You do not even need to program this feature yourself. It is already implemented in LabVIEW (Signal Processing » Point By Point » Other Functions):

 

[Image: dataqueue.png]
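
In spirit, that point-by-point function keeps a fixed-length FIFO of the most recent samples and re-filters it each time a point arrives. A rough Python analogue using a deque plus SciPy's Savitzky-Golay filter (window length and polynomial order are illustrative choices, not values from the thread):

```python
from collections import deque
from scipy.signal import savgol_filter

WINDOW = 11  # odd window length for Savitzky-Golay -- illustrative
POLY = 3     # polynomial order -- illustrative

history = deque(maxlen=WINDOW)  # fixed-size FIFO, like a point-by-point data queue

def smooth_point(new_value):
    """Feed in one reading; return the smoothed value for the newest point."""
    history.append(new_value)          # the oldest sample is evicted automatically
    if len(history) < WINDOW:
        return new_value               # not enough history yet: pass the raw value through
    # Fit the window and keep only the most recent smoothed sample.
    return savgol_filter(list(history), WINDOW, POLY)[-1]
```

Calling `smooth_point(reading)` once per 10-second reading mirrors the point-by-point style: one value in, one smoothed value out.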

 

 

 

Message 3 of 15
(4,176 Views)

Nice one, Blokk.... never looked into that subpalette 🙂

Norbert
Message 4 of 15
(4,174 Views)

Thank you, Blokk and Norbert,

 

Blokk, that VI helps me a lot! Thank you 😄

Message 5 of 15
(4,154 Views)

You might also consider oversampling followed by filtering/averaging/fitting, and then discarding all but one result every 10 seconds.

 

For example: sample at 1 kHz and set up a loop to read 1000 samples at a time. Every 10 iterations of the loop, retain the latter portion of that 1000-sample set, do your processing on it, and turn it into a single representative data point. This is *much* more real-timeish than running your signal processing off data from 10, 20, 30+ seconds ago.
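
A runnable Python sketch of that scheme (the simulated `read_chunk` stands in for a real DAQ read; the tail length and the plain mean are illustrative choices):

```python
import numpy as np

FS = 1000          # sample rate in Hz (per the 1 kHz example)
CHUNK = FS         # read 1000 samples (1 s of data) per loop iteration
REPORT_EVERY = 10  # one reported point every 10 iterations = every 10 s

def reduce_block(samples, keep_last=200):
    """Keep the latter portion of the block and collapse it to one point."""
    return float(np.mean(samples[-keep_last:]))

def read_chunk():
    """Stand-in for a hardware read; a real program would call the DAQ driver."""
    return 5.0 + 0.1 * np.random.randn(CHUNK)

points = []
for i in range(1, 31):                        # 30 s of simulated acquisition
    samples = read_chunk()
    if i % REPORT_EVERY == 0:                 # at each 10-second mark...
        points.append(reduce_block(samples))  # ...emit one locally smoothed point
print(points)
```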

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 6 of 15
(4,134 Views)

It is a nice idea.

OP: do not forget that it is important to decouple your DAQ from your data-processing part. A Producer/Consumer design gives you this feature. Even better, this way you send an array of doubles via the queue from the DAQ loop to the consumer, so you do not need an additional queue/shift register, etc. (see the sketch below).
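
A rough Python sketch of that producer/consumer decoupling (LabVIEW would use its own queue functions in two loops; the timing and the sentinel shutdown here are illustrative):

```python
import queue
import random
import threading
import time

data_q = queue.Queue()  # decouples acquisition (producer) from processing (consumer)

def producer():
    """Acquisition loop: push each raw reading onto the queue and keep going."""
    for _ in range(5):
        data_q.put(random.gauss(5.0, 0.1))  # stand-in for the 10 s particle-size read
        time.sleep(0.01)                    # shortened from 10 s for the demo
    data_q.put(None)                        # sentinel: acquisition finished

def consumer():
    """Processing loop: runs independently, so slow smoothing never stalls the DAQ."""
    while (value := data_q.get()) is not None:
        print(f"smooth/log/display: {value:.3f}")

t = threading.Thread(target=producer)
t.start()
consumer()
t.join()
```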

Message 7 of 15
(4,127 Views)

What exactly do you mean?

I get one data point every 10 seconds and in between nothing. I could probably read the same value for 10 seconds at a high sampling rate, but I don't see the benefit of it.

 

Message 8 of 15
(4,121 Views)

I am using a rather large QMH design. I should be covered there 🙂

Message 9 of 15
(4,119 Views)

Qbach wrote:

...I could probably read the same value for 10 seconds at a high sampling rate but I don't see the benefit of it.


Your question was about smoothing your data. Since you aren't satisfied simply living with the raw reading you take once every 10 seconds, I'm suggesting that the estimate you can make by oversampling and locally smoothing right near those 10-second marks will be an improvement. Otherwise, the only data available to run a smoothing algorithm is 10, 20, 30+ seconds old. I find it hard to believe that'd be *more* relevant.

 

The whole point is that you won't be reading the same value for 10 seconds at a high sampling rate.   Variations or noise in your process, your sensor, or your signal path will see to that.  If none of those things were changing, you wouldn't have asked about smoothing in the first place.

 

 

-Kevin P

Message 10 of 15
(4,087 Views)