LabVIEW


Averaging to Smooth Continuous Data

I'm sampling two sources of continuous data and one of the sources is quite noisy.  I've attached a screenshot of both signals plotted on a graph and the VI I have so far.  The white trace almost entirely covers the red trace due to noise.  Right now I'm sampling at 1000 Hz.  What I want to do is increase the sampling rate to 10000 Hz and then average every 10 samples together.  The y-component of the waveform is fed into a While Loop that contains the Mean PtByPt VI, with the sample length set to 10.  After the averaging is done I build the waveform again, but multiply the dt by 10 so the averaged data is spaced correctly in time.  What I don't understand is what to wire to the loop count on the For Loop.  I could wire an Array Size VI between the y-component of Get Waveform Components and the loop count terminal.  In that case, how would the For Loop operate for one iteration of the outer While Loop?  Say 1000 samples were read into the While Loop by the DAQmx Read VI.  Would the For Loop iterate 1000 times and the Mean VI produce 100 averaged samples?
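For reference, here is a minimal NumPy sketch of the decimation-by-averaging described above (the function name and test signal are illustrative, not part of the original VI): 1000 samples at 10 kHz are averaged in blocks of 10, yielding 100 samples with a dt 10x larger.

```python
import numpy as np

def block_average(samples, block=10):
    """Average every `block` consecutive samples (decimate by `block`)."""
    n = len(samples) // block * block            # drop any trailing partial block
    return samples[:n].reshape(-1, block).mean(axis=1)

fs = 10_000.0                                    # oversampled rate, Hz
t = np.arange(1000) / fs
noisy = np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.random.randn(t.size)

averaged = block_average(noisy, block=10)        # 1000 samples -> 100 samples
new_dt = 10 / fs                                 # dt of the averaged waveform
print(averaged.size)                             # 100
```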

Message 1 of 5

Don't.  The For Loop has its array tunnel set to auto-indexing, which means it runs once for each element of the array — you don't need to wire the count terminal at all.
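To see why this matters for the original question: Mean PtByPt keeps an internal history and emits one running mean per call, so an auto-indexed For Loop over 1000 samples produces 1000 outputs, not 100 — to decimate, you would keep only every 10th output. A rough Python analogue (the class is a stand-in, not NI's implementation):

```python
from collections import deque

class MeanPtByPt:
    """Rough analogue of Mean PtByPt: running mean over the last `length` samples."""
    def __init__(self, length=10):
        self.window = deque(maxlen=length)

    def __call__(self, x):
        self.window.append(x)
        return sum(self.window) / len(self.window)

mean_pt = MeanPtByPt(length=10)
samples = list(range(1000))
outputs = [mean_pt(s) for s in samples]    # one output per input sample
print(len(outputs))                        # 1000, not 100
decimated = outputs[9::10]                 # keep every 10th to get 100 points
print(len(decimated))                      # 100
```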

 

I would recommend looking at the online LabVIEW tutorials:
LabVIEW Introduction Course - Three Hours
Learn LabVIEW

Message 2 of 5

I gather you have not yet had a "Signals" course (I'm assuming, without data, that you are, or once were, an Engineering Student).  You seem to have data with a low frequency "Signal" and much higher frequency "Noise".  The higher your sampling frequency, the higher your ability to resolve high frequency components of both your Signal and your Noise.  Truly "random" Noise has all frequencies in it, so it is often removed by some form of low-pass filtering.  One of the simplest Low Pass Filters is to simply decrease the Sampling Rate.

 

What I learned (as a non-Engineer) is that a good "rule of thumb" is to sample at 10 times the highest frequency you expect in your signal.  It looks (from your picture) like your signal has a period of 10000 sample points -- if I assume that this is data from a 10 kHz sampling rate, the signal is at 1 Hz, so if you decrease the Sampling Rate to 10 Hz, you will (a) get a good representation of the signal (10 points makes a pretty nice sinusoid) and (b) eliminate much of the "noise".

 

Bob Schor

Message 3 of 5

I think Bob might have forgotten his Wheaties this morning...

 

Lowering the sample rate won't remove the noise.  You *are* better off oversampling and then filtering (where averaging is one simple variety of filter).

 

Suggestions (in priority order):

1. Work to reduce the noise at its source.  Sometimes this is easy, sometimes it doesn't yield to reasonable efforts.

2. Add an analog lowpass anti-aliasing filter between the signal and your DAQ device.  The cutoff freq should be no higher than 1/2 the sample rate! 

3. Oversample (like you're doing), and apply a digital lowpass filter to the data, then

4. Downsample by averaging.  The digital filter will really cut out high freq content and noise.  Averaging will help reduce any remaining low freq "random" noise.
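Steps 3 and 4 can be sketched in a few lines of Python (a simple 10-tap moving average stands in for the digital lowpass here; a properly designed filter would cut high-frequency content more sharply):

```python
import numpy as np

fs = 10_000.0                                    # oversampled rate, Hz
t = np.arange(10_000) / fs
signal = np.sin(2 * np.pi * 1.0 * t)             # 1 Hz signal
noisy = signal + 0.3 * np.random.randn(t.size)

# Step 3: digital lowpass filter (here, a 10-tap moving average)
taps = np.ones(10) / 10
filtered = np.convolve(noisy, taps, mode="same")

# Step 4: downsample by 10 -> effective rate 1 kHz
decimated = filtered[::10]
dt_new = 10 / fs
print(decimated.size)                            # 1000
```

SciPy users would typically reach for `scipy.signal.decimate`, which combines the anti-alias filtering and the downsampling in one call.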

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 4 of 5

@Kevin_Price wrote:

I think Bob might have forgotten his Wheaties this morning...


Yup, sorry for that.

 

BS

Message 5 of 5