08-02-2010 10:36 AM
We are working with two separate computers, one running LabVIEW 7.1 and one running LabVIEW 2009. We have some Raman spectroscopy data from which we're removing a background using a polynomial fit through a subset of the spectrum that doesn't contain any important spectral content. The curve fit was giving strange results on the LabVIEW 2009 computer, so I tried to make a test function for it. It turns out that the results are quite different between the two systems (VIs for each version attached). The 2009 system returns insane results for anything at or above a 5th-order curve fit (sample screenshot attached). It does this even when trying to fit an actual 5th-order polynomial. The 7.1 system gives the expected results for all these curves regardless of the algorithm or (within reason) the order I choose.
From reading other posts on this topic on the forum, I realize now that the data is possibly poorly scaled, and that can lead to fitting problems with certain algorithms. Switching algorithms away from SVD does solve the problem and we're able to proceed with our work, so I don't really need a solution in response to this post. The reason I'm posting anyway is that I'm still confused (and curious) about why there's such a discrepancy between the two versions when using allegedly the same algorithm. Is there some way the newer function could be improved to have better automatic error checking, so it rejects clearly erroneous curve fits (i.e. ones that don't even intersect the data being fitted)?
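For anyone else hitting this: the scaling problem described above can be reproduced outside LabVIEW. This is a minimal sketch in Python/NumPy (not the LabVIEW VI itself, and the x/y ranges are made up for illustration): with a raw axis spanning thousands of units, the Vandermonde matrix a polynomial least-squares solver works with becomes extremely ill-conditioned at 5th order and above, while rescaling x to [-1, 1] before fitting keeps it well-conditioned.

```python
import numpy as np

# Hypothetical poorly scaled x axis, similar in magnitude to a
# spectrometer pixel axis (illustrative values, not the original data)
x = np.linspace(0.0, 3647.0, 500)
y = 1e-3 * x**2 + 5.0 * x + 100.0  # a known low-order polynomial "background"

order = 5
# Condition number of the Vandermonde matrix the solver sees on the raw axis
cond_raw = np.linalg.cond(np.vander(x, order + 1))

# Rescale x to [-1, 1] before fitting
x_scaled = 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0
cond_scaled = np.linalg.cond(np.vander(x_scaled, order + 1))

print(f"condition number, raw x:    {cond_raw:.3e}")
print(f"condition number, scaled x: {cond_scaled:.3e}")

# Fit on the scaled axis; the coefficients refer to x_scaled, but the
# evaluated background curve is the same
coeffs = np.polyfit(x_scaled, y, order)
residual = np.max(np.abs(np.polyval(coeffs, x_scaled) - y))
print(f"max fit residual on scaled axis: {residual:.3e}")
```

The same idea applies in LabVIEW: rescale the wavenumber (or pixel) axis to a small symmetric range before the polynomial fit, then evaluate the background on that same rescaled axis, and the choice of algorithm matters much less.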
08-03-2010 05:12 PM
Thanks, JRanalli.
I will look into this and keep this forum posted with any updates.
01-18-2013 08:07 AM
I also get the impression that the default SVD algorithm is unstable in LV 2012. My X runs from 0 to 3647 and my Y from 0 to 65535. With any other algorithm I get good results...