02-06-2007 09:59 PM
02-07-2007 12:04 AM - edited 02-07-2007 12:04 AM
OK, I don't quite understand your code. Should the output be a single number, the average of all 2000 points? Or do you want the average of each row or column, giving another 1D array?
Since your question is not very clear, we can only guess what you actually want. The code in the image creates an array with 100 rows, where each row is the original 20-element dataset with added noise. Then we take the average of all elements in the 2D array (sum all elements and divide by the number of elements). Most likely you want a slightly different answer, so modify as needed, or clarify your needs if you get stuck. 🙂
Message Edited by altenbach on 02-06-2007 10:05 PM
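The code in the image is a LabVIEW VI, so as a language-neutral sketch of the same idea here it is in Python. The base dataset and the noise level (sigma) are made-up assumptions, stand-ins for whatever the original data is:

```python
import random
import statistics

# Hypothetical 20-point base dataset (a stand-in for the real data).
base = [2.86 + 0.0026 * i for i in range(20)]

# Build 100 rows, each the base data plus independent Gaussian noise.
# The noise sigma of 0.005 is an assumption, not from the original post.
random.seed(0)
rows = [[x + random.gauss(0.0, 0.005) for x in base] for _ in range(100)]

# Average of ALL elements of the 2D array:
# sum every element and divide by the total count (100 * 20 = 2000).
grand_mean = statistics.mean(v for row in rows for v in row)
print(grand_mean)
```

With 2000 noisy samples, the grand mean lands very close to the mean of the noise-free base data, since the zero-mean noise largely cancels out.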
02-07-2007 01:40 AM
Thank you for your reply.
I am sorry for my unclear wording.
I have a 1×20 array, like
A=[2.8600 2.8626 2.8653 2.8679 2.8705 …]
After adding Gaussian noise, the arrays become
A1=[2.8606 2.8604 2.8608 2.8662 2.8646…]
A2=[2.8603 2.8644 2.8637 2.8693 2.8698…]
A3=[2.8604 2.8654 2.8632 2.8678 2.8632…]
…
one hundred or more arrays. I want to calculate the mean of the arrays: Ave = (A1 + A2 + … + A100)/100.
I would like to test whether the difference between A and Ave decreases as the number of averaged arrays increases.
Please give me some more help.
thanks
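That elementwise average, and the check that the error shrinks as more arrays are averaged, can be sketched as follows. Since the original work is in LabVIEW, this Python version is only an illustration; the base array A, the noise sigma, and the sample counts are hypothetical:

```python
import random

SIGMA = 0.005  # assumed noise level, not from the original post
random.seed(1)

# Hypothetical 1x20 base array A (a stand-in for the poster's data).
A = [2.86 + 0.0026 * i for i in range(20)]

def noisy_copy(a):
    """Return a copy of a with independent Gaussian noise on each element."""
    return [x + random.gauss(0.0, SIGMA) for x in a]

def elementwise_mean(arrays):
    """Ave = (A1 + A2 + ... + An) / n, computed element by element."""
    n = len(arrays)
    return [sum(col) / n for col in zip(*arrays)]

def rms_error(a, b):
    """Root-mean-square difference between two equal-length arrays."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

# Generate 400 noisy copies of A, then average the first n of them
# for increasing n and watch the error against the clean A shrink.
arrays = [noisy_copy(A) for _ in range(400)]
for n in (1, 10, 100, 400):
    ave = elementwise_mean(arrays[:n])
    print(n, rms_error(A, ave))
```

For independent zero-mean noise, the error of the average falls off roughly like SIGMA/sqrt(N), so quadrupling the number of averaged arrays should about halve the residual difference between A and Ave.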