
reduce sampling time output

Hi all,

 

So let me preface this by saying that I am a novice at LabVIEW. I have some experience with C and MATLAB, but very little with Simulink and none with LabVIEW.

 

My problem is that I have a few sensors that will be running for 24 hours, which would output nearly a million data points. What I would like is for the code to log one point every 6 seconds (60 times less often than the 10 Hz sampling it is currently doing). Ideally, it would average the previous 60 points to produce that 1 point.
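Since I think better in text than in block diagrams, here is roughly what I am after as a Python sketch (just an illustration of the idea, not my actual VI; the 10 Hz rate and the factor of 60 are the numbers above):

```python
import numpy as np

SAMPLE_RATE_HZ = 10   # current acquisition rate
BLOCK_SIZE = 60       # 60 samples = 6 seconds at 10 Hz

def block_average(samples):
    """Collapse every 60 consecutive samples into one averaged point."""
    n_blocks = len(samples) // BLOCK_SIZE
    trimmed = np.asarray(samples[:n_blocks * BLOCK_SIZE])
    # Reshape into (n_blocks, 60) and average each row.
    return trimmed.reshape(n_blocks, BLOCK_SIZE).mean(axis=1)

# 24 hours at 10 Hz is 864,000 raw points; after averaging, 14,400 points.
raw = np.random.rand(24 * 3600 * SAMPLE_RATE_HZ)
logged = block_average(raw)
print(len(raw), "->", len(logged))
```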

 

I have tried a few things already, but I have basically no understanding of how they work or why they are not working correctly.

I tried: 

Sample Compression: this actually worked fairly well. It averaged my 60 points and output the result. However, I got exactly the same number of data points as I did without it, and the time gave a weird reading (3.2×10^9) rather than the usual seconds counting up.

 

Mean: this worked similarly to Sample Compression, except the averages at the start were low, since it drew on values from t < 0, which were all 0. After 6 seconds it behaved the same way, and the output size remained unchanged.
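If I understand that behavior right, the Mean acts like a running average whose history starts out all zeros. A little Python sketch of the effect I think I was seeing (again just an illustration, not my VI):

```python
import numpy as np

BLOCK_SIZE = 60

def running_mean(samples, window=BLOCK_SIZE):
    """Moving average whose history is zero before t = 0."""
    buffer = np.zeros(window)          # the t < 0 values are all 0
    out = []
    for x in samples:
        buffer = np.roll(buffer, -1)
        buffer[-1] = x
        out.append(buffer.mean())      # one output per input: same size!
    return np.array(out)

signal = np.full(120, 5.0)             # a constant 5 V signal
avg = running_mean(signal)
print(avg[0], avg[30], avg[59], avg[100])  # ~0.083, ~2.58, 5.0, 5.0
```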

 

1D Decimation after the Mean: this did not actually output anything, which is odd, because I thought it would.

 

If someone could explain how I should go about doing this, that would be really helpful. I have a feeling this question should be easy, but I have no idea how to troubleshoot these ideas.

 

Thanks for your time,

tgm 

Message 1 of 6
One obvious and simple way is to just request 6 seconds' worth of data with each read. It's basic arithmetic: number of samples / samples per second = acquisition time. That makes the loop a bit unresponsive, so you might want to consider getting 1 second at a time and appending each new array to the previous one. The Mean function will work just fine.
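In text terms the pattern is just this (a rough Python sketch; read_samples() is a made-up stand-in for whatever DAQmx Read or DAQ Assistant call you have, not a real function):

```python
import numpy as np

SAMPLE_RATE_HZ = 10

def read_samples(n):
    """Placeholder for the DAQ read call; returns n samples."""
    return np.random.rand(n)

# Option 1: request 6 seconds of data per read, one averaged point each time.
# 60 samples / 10 samples per second = 6 seconds per loop iteration.
point = read_samples(60).mean()

# Option 2: keep the loop responsive by reading 1 second at a time and
# appending until a full 6-second block has accumulated, then average it.
block = np.empty(0)
while len(block) < 60:
    block = np.append(block, read_samples(SAMPLE_RATE_HZ))
point = block.mean()
```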
Message 2 of 6

Hey Dennis!

Thank you very much for your help and time. This is going to sound really bad, but I don't even know how to go about doing that. Also, we are printing to scopes before we output to LVM, so it would be beneficial to still read/sample at this rate. Thanks

 

-tgm

Message 3 of 6
Are you using the DAQ Assistant or the DAQmx functions? Have you configured the acquisition in MAX or in your code?

I don't understand what you mean by 'printing' to scopes.
Message 4 of 6

The DAQ Assistant, which then splits into 10 different signals. These 10 signals are displayed on charts in real time. As for your second question, I don't know; it was set up by someone who knew more than I do.

I attached what I am working with, if that helps. There are a few things in it doing nothing, because I was just seeing if I could get them to work.

Thanks again for your time,

 

-tgm

Message 5 of 6
Sorry, I'm posting from my phone and can't look at the VI. Attach an image of the DAQ Assistant setup and your block diagram, please.
Message 6 of 6