07-16-2015 07:48 AM
I'm using a least-squares fit (General Linear Fit VI) on my data and would like to report the standard errors (SE) of the estimated coefficients along with the results.
As I read somewhere, the covariance output needs to be scaled by SSE/DOF, where SSE is the sum of squared errors and DOF is the degrees of freedom. Now, if I take the residue output, which according to the help pages should be the MSE (mean squared error) of my fit, to calculate the SSE, I get a variance-covariance matrix that is exactly half of the expected one (I compared my results to those obtained with the software R)...
So, if I use SSE = 2*MSE*N, I get the correct results...
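For reference, the scaling described above can be sketched in Python/NumPy (the data, model, and variable names here are illustrative, not taken from the attached VI): the covariance of the coefficients is (SSE/DOF) times the inverse of HᵀH, where H is the design matrix, and the SEs are the square roots of its diagonal.

```python
import numpy as np

# Hypothetical data: fit y = a0 + a1*x with N = 10 points.
rng = np.random.default_rng(0)
N = 10
x = np.linspace(0.0, 1.0, N)
H = np.column_stack([np.ones(N), x])            # design matrix
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.1, N)     # noisy observations

# Least-squares solution and residuals
coeffs, *_ = np.linalg.lstsq(H, y, rcond=None)
resid = y - H @ coeffs

# SSE, DOF, and the scaled variance-covariance matrix
SSE = np.sum(resid**2)
DOF = N - H.shape[1]                            # N minus number of coefficients
cov = (SSE / DOF) * np.linalg.inv(H.T @ H)

# Standard errors of the estimated coefficients
se = np.sqrt(np.diag(cov))
```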
I've attached a small example to illustrate my problem (be generous, it is my very first VI), where you can see that the MSE calculated by hand is double the residue output...
Am I doing something wrong?
07-16-2015 12:53 PM - edited 07-16-2015 12:58 PM
The way you are inserting your arrays makes no sense. All your arrays are actually length 20 after the initial prep step, while you are expecting length 10. Once you use the inputs properly, the results agree. (The stock VI correctly uses size = 20, while you are dividing by only 10.)
Remember, "Insert Into Array" changes the array size, making room for the new data by shifting the existing (higher-index) elements up. Maybe you were looking for "Replace Array Subset" instead? Still, your array prep smells funny. Makes no sense!
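The difference between the two functions can be sketched with a rough Python analogy (the actual VIs are LabVIEW primitives; the list here just mimics their size behaviour): inserting grows the array, while replacing keeps its size fixed.

```python
arr = [0.0] * 10          # a pre-allocated array of length 10

# "Insert Into Array"-style behaviour: the array grows,
# and the original elements are shifted toward higher indices.
grown = arr.copy()
for i in range(10):
    grown.insert(i, float(i))
# grown now has length 20: the new values, then the original 10 zeros

# "Replace Array Subset"-style behaviour: elements are overwritten in place,
# so the array stays length 10.
replaced = arr.copy()
for i in range(10):
    replaced[i] = float(i)
```

This is exactly why the poster's arrays ended up length 20 instead of the expected 10.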
07-17-2015 01:59 PM
Thanks for your reply! What a stupid error of me...
Cheers, Chris