


how to average n datapoints every 10 seconds


Hello all,

I am fairly new to programming and LabVIEW. I am using the DAQ Assistant to read a continuous voltage at 1k samples per read and a 1 kHz rate. What I would like to do is take an average every 10k samples (every 10 seconds) and record the average voltage value along with a time stamp. For the time stamp I do not need the date and time, just the seconds elapsed since the start of the run.

Can someone point me in the right direction as to how I can average the voltage readings from the DAQ every 10 seconds?

Message 1 of 5

The easy way is to set your sampling rate to 1 kHz and your number of samples to 10k, then use the Mean function in the Mathematics palette to average them. Build an array of the elapsed time and your mean value and send it to Write to Spreadsheet File.vi to log your data.
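In text form, the same idea looks roughly like this. It is only a sketch, assuming NI's nidaqmx Python package and a hypothetical device channel "Dev1/ai0"; in LabVIEW you would wire the equivalent on the block diagram.

    # Rough Python sketch of the "easy way": read one 10-second block,
    # average it, and log elapsed time plus the mean. The nidaqmx package
    # and the channel name "Dev1/ai0" are assumptions.
    import time
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    RATE = 1000      # 1 kHz sample clock
    BLOCK = 10000    # 10,000 samples = one 10-second block

    with nidaqmx.Task() as task, open("voltages.txt", "w") as log:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        task.timing.cfg_samp_clk_timing(
            RATE, sample_mode=AcquisitionType.CONTINUOUS)
        start = time.monotonic()
        while True:
            # Blocks for about 10 s until a full block is available.
            data = task.read(number_of_samples_per_channel=BLOCK,
                             timeout=15.0)
            elapsed = time.monotonic() - start   # seconds since start
            log.write(f"{elapsed:.1f}\t{sum(data) / len(data):.6f}\n")
            log.flush()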

For better timing characteristics a Producer/Consumer architecture is superior, but the DAQ Assistant and Write to Spreadsheet VI would have to go in favor of the more efficient DAQmx and File I/O primitives.
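For what it's worth, the producer/consumer split looks roughly like this in text form: one loop only acquires, the other only averages and writes, so a slow disk never stalls the acquisition. This is a hedged sketch using Python threads and a queue; read_block() is a hypothetical stand-in for the DAQmx read shown above.

    # Producer/consumer sketch: the producer pushes (elapsed, block)
    # pairs into a queue; the consumer averages and logs them.
    # read_block() is a hypothetical stand-in for the DAQmx read call.
    import queue
    import threading
    import time

    data_q = queue.Queue()

    def producer(read_block):
        start = time.monotonic()
        while True:
            block = read_block()                   # ~10 s of samples
            data_q.put((time.monotonic() - start, block))

    def consumer(path="voltages.txt"):
        with open(path, "w") as log:
            while True:
                elapsed, block = data_q.get()      # waits for the producer
                log.write(f"{elapsed:.1f}\t"
                          f"{sum(block) / len(block):.6f}\n")
                log.flush()

    # Usage: run the producer in a background thread, consume here.
    # threading.Thread(target=producer, args=(read_block,),
    #                  daemon=True).start()
    # consumer()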

Message 2 of 5

If I understand what you said, that would give me 10k samples every second, plus the average. What I am after is to read 10 seconds of data at 1k samples per second, take the average, and record the value, so that every minute I have six voltage entries in the spreadsheet, each entry being the average of 10k points over a 10-second interval.

Currently I am using the Statistics tool to compute the arithmetic mean, but I get all 1k samples every second in the spreadsheet. The spreadsheet grows in size very fast, and I need to monitor the voltages for 24-hour periods.

How can I store the values coming from the DAQ for 10 seconds and then pass those values to the averaging function?

Can you point me to the right VI or a sample VI that I can learn from?

Message 3 of 5
Solution (accepted by topic author mecha)

No, think about it. If you want to collect 10,000 samples at a rate of 1,000 samples/sec, it will take 10 seconds to collect them all.

Take a look at the attached VI. It's a bastardization of one of the DAQmx examples, but it should show you how to modify your code to get good results. Keep in mind you don't really have to collect this many samples to average, or sample this slowly; this is just to show you the idea.

There are far better ways to accomplish your task, but they require more advanced programming techniques. I hope this helps get you started.

BTW, I left the TDMS VIs in there to show you another way to store data. 
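If you ever need to produce the same kind of TDMS file outside LabVIEW, here is a small sketch. It assumes the third-party npTDMS package, which this thread does not otherwise use, so treat it as illustration only.

    # Sketch: append one averaged reading per call to a TDMS file using
    # the third-party npTDMS package (an assumption; LabVIEW's TDMS VIs
    # do the equivalent natively).
    import numpy as np
    from nptdms import TdmsWriter, ChannelObject

    def append_average(path, elapsed_s, mean_v):
        # mode="a" appends a new segment instead of overwriting the file.
        with TdmsWriter(path, mode="a") as writer:
            writer.write_segment([
                ChannelObject("averages", "elapsed_s",
                              np.array([elapsed_s])),
                ChannelObject("averages", "mean_voltage",
                              np.array([mean_v])),
            ])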

Good Luck! 

Message 4 of 5
Thank you, that file gave me what I needed. The Statistics tool I was using was the problem; I switched to the Mean VI and my code worked great. Thanks!
Message 5 of 5