I'm using the producer loop to handle UI events. One of these events is to start streaming data to the consumer loops. In one of the consumer loops I want to apply an algorithm to the data every 5 seconds, so if I was streaming data at 50 Hz I would apply the algorithm every 250 samples of data.
I'm trying to think of a suitable piece of logic to do this. I was thinking of using the consumer loop's iteration counter: since I know how long each iteration of the loop takes, I could apply the algorithm when the number of iterations reaches a certain value...
I'm just wondering, does anyone have another, perhaps more intuitive, way of going about this?
The iteration counter is probably a poor choice. When you get a faster computer next year the loop spins faster and your timing is off. Or you add some additional feature to the program and the loop runs slower.
How accurately do you need to measure the 5 seconds? Unless you need millisecond resolution or accuracy, you can just use the Tick Count or Time of Day functions. When you start your process, place the current time in a shift register. Each time you acquire some data, compare the current time to the start time. When the difference is > 5 seconds, apply the algorithm and reset the start time.
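Since the actual implementation would be LabVIEW G code, here is only a minimal Python sketch of that shift-register logic. The `TimedBatcher` class and its names are illustrative placeholders; the clock function is injected so the behavior can be checked without real delays:

```python
import time

class TimedBatcher:
    """Accumulate samples; add() returns a full batch whenever
    interval_s seconds have passed since the last batch, else None.
    The clock function is injectable for testing (time.monotonic
    by default)."""

    def __init__(self, interval_s=5.0, clock=time.monotonic):
        self.interval_s = interval_s
        self.clock = clock
        self.start = clock()   # plays the role of the start time in the shift register
        self.samples = []

    def add(self, sample):
        self.samples.append(sample)
        # Compare current time to start time; > interval means process.
        if self.clock() - self.start >= self.interval_s:
            batch, self.samples = self.samples, []
            self.start = self.clock()   # reset the start time
            return batch
        return None
```

Each call to `add` corresponds to one acquisition in the consumer loop; the check and reset mirror the compare/reset step described above.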
Another way is to just count your data samples. When you have accumulated 250 samples, process them. If you always want to process batches of exactly 250 samples, this is the best way. If you have 270 samples, take the 250 oldest ones for processing and keep the 20 newer ones as the first part of the next batch.
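The counting approach, including the "keep the newest samples for the next batch" detail, could be sketched in Python like this (again just an illustration of the logic, not the LabVIEW implementation; the function name and signature are made up for the example):

```python
def batch_samples(buffer, new_samples, batch_size=250):
    """Append new samples to the buffer. Whenever the buffer holds
    at least batch_size samples, split off the oldest batch_size
    samples for processing and keep the remainder as the start of
    the next batch. Returns (batches_ready, remaining_buffer)."""
    buffer = list(buffer) + list(new_samples)
    batches = []
    while len(buffer) >= batch_size:
        batches.append(buffer[:batch_size])   # oldest 250 go to processing
        buffer = buffer[batch_size:]          # newer ones carry over
    return batches, buffer
```

So with 270 accumulated samples, the first 250 come back as a batch to process and the remaining 20 stay in the buffer, exactly as described above.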
Yeah, the 5 seconds doesn't have to be that accurate; I'll look into using the time functions like you suggested.
I guess I'll be able to work a solution from there.
Thanks for the starting point.