

** LabVIEW Champion. It all comes together in GCentral **

**What does "Engineering Redefined" mean??**

Downhill Simplex (Nelder-Mead)

08-20-2017 01:33 PM - edited 09-25-2017 01:08 PM


This is the first public release of a 2006 effort to create a "drop-in" replacement for Nonlinear Curve Fit that uses Nelder-Mead instead of Levenberg-Marquardt.

Its only advantage is that it needs no numerical partial derivatives, making it useful in the rare cases where those derivatives are problematic (e.g. when the function model is microscopically rough because it uses iterative procedures internally).

Its disadvantages are typically much slower convergence than LM and a lack of good intrinsic parallelization (all partial derivatives in LM can be calculated in parallel, for example!).

The algorithm is based on Downhill Simplex nD and retains some of its shortcomings. For example, the initial simplex is generated by a simple "+1" offset of each parameter, which might not be a reasonable scale. Of course the scaling will adapt to the problem after a few iterations, unless we step off a cliff. 😄

To make it "pin compatible" with Nonlinear curve fit, I have added "weights" and the definition of termination conditions. An empty matrix is returned for the covariance.

While old, this code has not been used much and is thus not fully tested. Like Nonlinear Curve Fit, it can easily be used with multiple dimensions in the independent variables. Many improvements are possible.
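Outside LabVIEW, the same drop-in idea can be sketched in a few lines of Python (a hypothetical exponential-decay model with synthetic data; SciPy's Nelder-Mead stands in for the VI). The weighted sum of squared residuals plays the role of the curve-fit objective:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical model: exponential decay with offset
def model(x, p):
    a, tau, c = p
    return a * np.exp(-x / tau) + c

rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 50)
true_p = [2.0, 1.5, 0.5]
y = model(x, true_p) + 0.05 * rng.standard_normal(x.size)
weights = np.ones_like(x)   # plays the role of the "weights" input

# Weighted sum of squared residuals -- the quantity Nelder-Mead minimizes;
# no partial derivatives of the model are ever needed
def objective(p):
    r = y - model(x, p)
    return np.sum(weights * r ** 2)

res = minimize(objective, x0=[1.0, 1.0, 0.0], method="Nelder-Mead")
print(res.x)   # close to true_p
```

Because only function values are used, `model` may itself contain iterative or slightly noisy procedures, which is exactly the situation where a derivative-based method struggles.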

Examples:

You can use any of the examples for nonlinear curve fit and replace the fitting VI with this version. Here are some observations using the front panel defaults:

**Ellipse fit**: works fine

**Sum of three Gaussians with offset**: Does not converge because (1) the initial guesses are too poor and (2) the initial simplex scaling (+1) is way off. It works if better initial estimates are entered manually.

**Gaussian surface with Offset**: Works fine, but converges much more slowly, as expected.
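The simplex-scaling effect noted above can be reproduced in a Python sketch (hypothetical single-Gaussian data with very different parameter magnitudes; SciPy's `initial_simplex` option stands in for the VI's "+1" construction). Steps scaled to each parameter's magnitude give the simplex a sensible search scale from the start:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data: single Gaussian with very different parameter scales
def model(x, p):
    a, c, w = p   # amplitude ~1, center ~500, width ~10
    return a * np.exp(-((x - c) / w) ** 2)

x = np.linspace(400.0, 600.0, 201)
y = model(x, [1.0, 500.0, 10.0])   # noise-free synthetic data

def ssq(p):
    return np.sum((y - model(x, p)) ** 2)

guess = np.array([0.5, 480.0, 30.0])

# Initial simplex with per-parameter steps of 10% of each magnitude,
# instead of a blanket "+1" that ignores the parameter scales
steps = [guess + 0.1 * abs(guess[i]) * np.eye(3)[i] for i in range(3)]
res = minimize(ssq, guess, method="Nelder-Mead",
               options={"initial_simplex": np.array([guess] + steps),
                        "maxiter": 2000})
print(res.x)   # near [1, 500, 10]
```

A "+1" simplex would take steps of 1 in the center (tiny relative to 480) and 1 in the amplitude (twice the guess), so the early iterations waste time rediscovering the scales.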

(Attachment is in LabVIEW 2015)

*EDIT (9-25-2017): I also attached the same code down-converted to LabVIEW 8.2. There is nothing fancy in the code, so it should convert cleanly, but I have no way to test. Please report back if there are issues. You should be able to open this in any LabVIEW version 8.2 or newer.*

12 REPLIES


Re: Downhill Simplex (Nelder-Mead)

08-23-2017 08:56 AM


An example where Levenberg-Marquardt fails due to problems with the partial derivatives can be found here. In this case, Nelder-Mead continues to work fine.


Re: Downhill Simplex (Nelder-Mead)

08-28-2017 02:53 AM


Hi everyone,

After testing altenbach's Nelder-Mead VI on several problems of varying complexity, I have found it to be very useful and reliable.

However, if there are several solutions (i.e. two or more sets of parameters produce the same function values, so the parameters are ambiguous), the algorithm sometimes returns unrealistic parameters.

It would therefore be useful to give the VI some constraints, such as maximum/minimum values for the parameters, in order to avoid unrealistic solutions.

Does anyone have an idea how to implement that in the Nelder-Mead algorithm?

Best regards,

Hobgoblin


Re: Downhill Simplex (Nelder-Mead)

08-28-2017 08:04 AM


Yes. The central step in the NM Algorithm is to compute the value of your Objective Function at each point of the Simplex. If you want to impose a constraint (such as Parameter X lies in [a, b]), then if the Simplex point violates the constraint, you multiply the Objective Function by 100, basically saying "This is __not__ a good point", and the Simplex will "recoil" from this point.

You can find this technique (and the Method, itself) described by doing a Web search. I've incorporated this into my coding of the Algorithm.
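Bob's penalty trick can be sketched in a few lines of Python (hypothetical objective and bounds; a constant is added before scaling so the penalty still bites when the objective is near zero):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical objective: unconstrained minimum at (3, -2)
def objective(p):
    return (p[0] - 3.0) ** 2 + (p[1] + 2.0) ** 2

lo = np.array([0.0, 0.0])   # hypothetical box constraint:
hi = np.array([2.0, 5.0])   # 0 <= p0 <= 2, 0 <= p1 <= 5

def penalized(p):
    f = objective(p)
    if np.any(p < lo) or np.any(p > hi):
        return 100.0 * (abs(f) + 1.0)   # "this is NOT a good point"
    return f

res = minimize(penalized, x0=[1.0, 1.0], method="Nelder-Mead")
print(res.x)   # best point found, inside the box
```

The simplex recoils from penalized points, so the returned minimum stays inside the box. One caveat: a simplex collapsing against a hard penalty wall can stall, so loosening the tolerances or restarting from the best point sometimes helps.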

Bob Schor

Re: Downhill Simplex (Nelder-Mead)

08-28-2017 09:06 AM



Re: Downhill Simplex (Nelder-Mead)

08-28-2017 09:40 AM


BlessedK,

You would really benefit, I think, from a course on Numerical Methods. There you would learn about coming up with __numerical__ solutions to problems (meaning "put an actual number to it", not a symbolic representation like "pi"), and would understand what it means to __approximate__ an answer. For example, how do you represent "pi", or how do you compute the square root of 2? Neither can be exactly expressed as a Dbl ...

Minimization problems involve finding the minimum in some N-Dimensional Space. There are various methods, including "Brute Force" (try all the numbers in the Space and choose the one giving the minimum), which has obvious problems.

There are two common sources of "variability" (or "error") that often arise. One is that the Space may have multiple "dimples", local minima isolated from each other, so if you start with a poorly-chosen Initial Guess, you may find a minimum, but the "wrong" one. Here, a good Initial Guess gives you a better chance of converging to the "minimum you want" (notice I didn't say the "right" minimum ...).

The other problem is that the space may become "flat" near the minimum -- if you move a small amount in any direction, the value of the Objective Function might not change significantly (meaning when you do the computation, you get essentially the same value). If you are using a Gradient Descent (Levenberg-Marquardt) method, this is equivalent to having a Gradient near (or at) zero. With the Simplex Method, this is equivalent to all the points of the Simplex having the same "value", in which case the usual step is to use the centroid of the Simplex.

All of which comes down to "accepting the Approximation", and perhaps putting some tests in your code to detect some of these situations.
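The "multiple dimples" point is easy to demonstrate in Python (a hypothetical one-dimensional double-well objective): the same Nelder-Mead call lands in different minima depending on the initial guess.

```python
import numpy as np
from scipy.optimize import minimize

# Double-well objective: local minima near x = -1 and x = +1;
# the small linear tilt makes the left well the global minimum
def f(p):
    x = p[0]
    return (x * x - 1.0) ** 2 + 0.2 * x

left = minimize(f, [-2.0], method="Nelder-Mead")
right = minimize(f, [2.0], method="Nelder-Mead")
print(left.x, right.x)   # different minima from different starting points
```

Starting at -2 finds the global minimum near x = -1.02; starting at +2 converges to the local minimum near x = +0.97, and the algorithm has no way of knowing a better dimple exists elsewhere.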

Bob Schor


Re: Downhill Simplex (Nelder-Mead)

08-28-2017 10:04 AM


Thanks Bob Schor,

A lengthy response as usual, plus asking me to take a course on numerical methods 🙂 but I asked a simple question.

I understand all that you said.

My question is: is the Levenberg-Marquardt approach immune to the issue of non-unique solutions?

Would using good initial guesses tend to keep the results within a realistic range or not? Whatever you consider "correct" or "right" is again not unique; you specify "a range" as the constraint.


Re: Downhill Simplex (Nelder-Mead)

08-28-2017 10:23 AM


Sorry, BlessedK -- I'll try to restrain my teaching tendencies.

Is the Levenberg-Marquardt approach immune to the issue of non-unique solutions? **No.**

Would using good initial guesses tend to keep the results within a realistic range or not?

**Difficult to say, as "good initial guesses" is a tad ambiguous, as is "a realistic range". If the problem does have a Global Minimum, and your initial guess is closer to it than to other minima, you are "more likely" to find the Global Minimum.**

Bob Schor


Re: Downhill Simplex (Nelder-Mead)

08-28-2017 11:16 AM - edited 08-28-2017 12:13 PM


Thanks. Greatly Appreciated.

I'm not saying you should restrain your teaching tendencies. There's nothing wrong with answering the questions directly first and then expanding on the points (why, how so, etc.).

Those details are definitely very useful as are those from Altenbach too. I always appreciate both of your inputs in particular.

For some of us the focus is on the ultimate goal (the bigger picture) and these things only serve as a "tool". Perhaps of utmost importance to us is first recognizing the right tools we need. (Fit-for-purpose). We first focus on how to use the tool to achieve our immediate goals and then later gradually follow-up on other equally but not immediately important details.

Please keep up the details. I am mostly a self-taught person, but I also know that teachers facilitate understanding. (Except if every book on earth were written by K.A. Stroud)


Re: Downhill Simplex (Nelder-Mead)

08-29-2017 03:06 AM


Thank you for your input, that approach seems very reasonable.

And on the problem of "several" solutions: I often have models where it can easily be shown that there are several, sometimes even infinitely many, solutions that lead to the exact same result. The algorithm (no matter which one you use) of course can't distinguish between a physically meaningful solution and physical nonsense.
