I am generating a new 1000-point histogram with each iteration of a loop. I need to figure out the running average at each point in the histogram over 200 iterations without 200 shift registers. There are a few posts in this direction, but I can't seem to get the indexing right. Can someone help me out? Thanks. Jonathan
Hi Altenbach, even though I am not the one who asked, I'd like to thank you for the great example. That's really helpful. One thing I do not get is the sum: it seems you are adding over all rows and columns (i.e. all elements). How does the loop know to sum over just one dimension of the 2-D array? Cheers, Thomas
You are probably talking about the small FOR loop.
If you autoindex a 2D array, you always get a row (1D) array in each iteration, starting with the first row until you run out of rows. Since we actually want to sum each column, we transpose the 2D array first and everything falls into place. 🙂
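Since the original example is a LabVIEW VI, here is a rough Python sketch of the same idea, using plain lists to mimic autoindexing. The data values are made up for illustration; the point is that iterating a 2D array yields rows, so to sum each column (each histogram bin) you transpose first.

```python
# Three illustrative "histograms", one per loop iteration (values made up):
hist_2d = [
    [1, 2, 3],   # histogram from iteration 0
    [4, 5, 6],   # histogram from iteration 1
    [7, 8, 9],   # histogram from iteration 2
]

# Autoindexing the 2D array directly gives one ROW per iteration:
rows = [row for row in hist_2d]

# Transposing first makes each iteration see a COLUMN instead,
# so a plain sum in the loop adds up each bin across all histograms:
column_sums = [sum(col) for col in zip(*hist_2d)]
print(column_sums)  # [12, 15, 18]
```

Here `zip(*hist_2d)` plays the role of LabVIEW's Transpose 2D Array followed by autoindexing.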
The transposition is not even strictly needed, because you could e.g. add each of the histograms into a running sum kept in a shift register instead. There are many other ways to do this.
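To illustrate the shift-register alternative in the same Python sketch style (names and data are illustrative, not from the original VI): keep a per-bin running sum and add each histogram element-wise as it arrives, so no 2D array or transpose is needed.

```python
def add_histogram(running_sum, new_hist):
    """Element-wise add, like '+' applied to two LabVIEW arrays."""
    return [s + x for s, x in zip(running_sum, new_hist)]

running = [0, 0, 0]  # shift-register initial value (one slot per bin)
for hist in ([1, 2, 3], [4, 5, 6], [7, 8, 9]):
    running = add_histogram(running, hist)
print(running)  # [12, 15, 18]
```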
Note that with a bit more coding there are much smarter ways to do all that, and in this case the task could be done with significantly less CPU effort. For example, it is NOT necessary to re-add all spectra with each iteration; it is sufficient to add the newest and subtract the oldest trace from the sum of all histograms, which is probably 100x less work in this particular case. 🙂
Note that with a bit more coding there are much smarter ways to do all that ...
Well, here it does not really matter, because both complete in well under 1 ms, but here's a quick implementation of the above idea. It could be useful with larger datasets. We keep the sum of all rows in a second shift register, and at each iteration we read the current histogram at the insertion index (the oldest in the history!) before we overwrite it with the newest. Then we add the newest and subtract the oldest from the sum of all histograms and divide by the number of histograms currently in it.
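The steps just described can be sketched in Python as a fixed-size circular history plus a running sum. The sizes here (`N_BINS = 5`, `N_HIST = 4`) are small stand-ins for the 1000 bins and 200 iterations in the original question, and all names are illustrative, not from the VI.

```python
N_BINS = 5        # 1000 in the original question
N_HIST = 4        # 200 in the original question

history = [[0] * N_BINS for _ in range(N_HIST)]  # circular buffer of histograms
running_sum = [0] * N_BINS                       # "second shift register"
count = 0                                        # histograms currently in the history

def update(new_hist, i):
    """Store new_hist at slot i % N_HIST and return the current running average."""
    global count
    idx = i % N_HIST
    oldest = history[idx]          # read the oldest BEFORE overwriting it
    history[idx] = new_hist
    # Add the newest and subtract the oldest from the running sum:
    for b in range(N_BINS):
        running_sum[b] += new_hist[b] - oldest[b]
    count = min(count + 1, N_HIST)
    return [s / count for s in running_sum]

# Example: feed a constant histogram; the running average stays constant.
for i in range(10):
    avg = update([1, 2, 3, 4, 5], i)
print(avg)  # [1.0, 2.0, 3.0, 4.0, 5.0]
```

Each update touches only two histograms regardless of the history length, which is where the roughly 100x saving over re-summing all 200 histograms comes from.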