The Nelder-Mead algorithm is a rather popular algorithm for (low-dimensional) nonlinear programming. The original version (from 1965, see [1]) has known and varied failure modes, but never really had to fear for its popularity. It doesn’t need derivatives, which can be quite convenient, and has a reputation for working well even with noisy and rough functions.
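To make the flavor of the method concrete, here is a minimal sketch of the classic 1965 algorithm (with the standard reflection/expansion/contraction/shrink coefficients) in Python. This is only an illustration of the basic method, not the grid-restrained variant discussed below and not the Common Lisp code offered here; the test function, starting point, step size, and tolerance are all arbitrary choices.

```python
import numpy as np

def nelder_mead(f, x0, tol=1e-10, max_iter=5000):
    # Classic Nelder-Mead (1965), standard coefficients:
    # reflection 1, expansion 2, contraction 1/2, shrink 1/2.
    n = len(x0)
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):                       # initial simplex: a small step
        v = simplex[0].copy()                # along each coordinate axis
        v[i] += 0.1
        simplex.append(v)
    fvals = [f(v) for v in simplex]
    for _ in range(max_iter):
        order = np.argsort(fvals)            # sort vertices: best first
        simplex = [simplex[i] for i in order]
        fvals = [fvals[i] for i in order]
        if fvals[-1] - fvals[0] < tol:       # best and worst nearly equal
            break
        centroid = np.mean(simplex[:-1], axis=0)   # centroid without worst
        xr = centroid + (centroid - simplex[-1])   # reflect the worst point
        fr = f(xr)
        if fr < fvals[0]:                          # new best: try to expand
            xe = centroid + 2.0 * (centroid - simplex[-1])
            fe = f(xe)
            simplex[-1], fvals[-1] = (xe, fe) if fe < fr else (xr, fr)
        elif fr < fvals[-2]:                       # better than second worst
            simplex[-1], fvals[-1] = xr, fr
        else:
            if fr < fvals[-1]:                     # outside contraction
                xc, fref = centroid + 0.5 * (xr - centroid), fr
            else:                                  # inside contraction
                xc, fref = centroid + 0.5 * (simplex[-1] - centroid), fvals[-1]
            fc = f(xc)
            if fc < fref:
                simplex[-1], fvals[-1] = xc, fc
            else:                                  # shrink toward best vertex
                simplex = [simplex[0]] + [simplex[0] + 0.5 * (v - simplex[0])
                                          for v in simplex[1:]]
                fvals = [fvals[0]] + [f(v) for v in simplex[1:]]
    return simplex[0], fvals[0]

# Example: minimize a shifted paraboloid with minimum at (3, -1).
f = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2
xmin, fmin = nelder_mead(f, [0.0, 0.0])
```

Note that nothing here requires a gradient: the simplex is driven purely by comparisons of function values, which is what makes the method attractive for noisy or rough objectives — and also what makes the convergence failures mentioned above possible.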

Recently, a few provably convergent variants have been devised (see [2], [3], and [4]). The one by A. Bürmen et al. [4], the “Grid Restrained Nelder-Mead Algorithm” (GRNMA), is notable because it does not impose a “sufficient descent” condition at each iteration, which keeps it closer to the original algorithm.

A reliable, simple-to-use derivative free minimization routine is a handy little tool to have around. It turns out that implementing the GRNMA is also a pleasant programming project.

Here you can download a rather faithful implementation of the GRNMA (in Common Lisp). I also included the classical Nelder-Mead algorithm for completeness.

Judging from the numerical experiments in [4], the performance of these improved methods is about equal to that of the original, in terms of the number of function evaluations. In fact, as a side effect of the additional reliability, they tend to be more efficient than the original algorithm even on problems where it does not fail.

The code is under the MIT licence, with no warranty of any sort, etc.

References:

[1] J.A. Nelder and R. Mead, “A simplex method for function minimization,” The Computer Journal, vol. 7, pp. 308-313, 1965.

[2] P. Tseng, “Fortified-descent simplicial search method: A general approach,” SIAM Journal on Optimization, vol. 10, pp. 269-288, 1999.

[3] C.J. Price, I.D. Coope, and D. Byatt, “A convergent variant of the Nelder-Mead algorithm,” Journal of Optimization Theory and Applications, vol. 113, pp. 5-19, 2002.

[4] A. Bürmen, J. Puhan, and T. Tuma, “Grid Restrained Nelder-Mead Algorithm,” Computational Optimization and Applications, vol. 34, no. 3, pp. 359-375, 2006.
