**The CMA Evolution Strategy for Noisy and Global Optimization: Implementations in MATLAB**

The CMA-ES (Evolution Strategy with Covariance Matrix Adaptation) is an evolutionary (search) algorithm for difficult optimization problems. The CMA-ES is typically applied to unconstrained or bound-constrained continuous optimization problems with search space dimensions between, say, three and a hundred. The method is appropriate when derivative-based methods, e.g. quasi-Newton (BFGS) or conjugate gradient, presumably fail due to a rugged search landscape (e.g. noise, local optima, outliers). When second-order derivative-based methods succeed, they are usually considerably faster than the CMA-ES.

Similar to quasi-Newton methods, the CMA-ES estimates the inverse Hessian matrix (here: the covariance matrix) within an iterative procedure. In contrast to quasi-Newton methods, the CMA-ES neither computes nor uses gradients. The former makes the method feasible on non-separable and/or badly conditioned problems; the latter makes it feasible on multimodal and/or noisy problems.
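To illustrate the iterative, gradient-free estimation described above, here is a simplified Python/NumPy sketch (not one of the provided MATLAB files): candidates are sampled from a multivariate normal distribution, ranked by objective value, and the mean and covariance are updated from the best samples. Step-size control and the evolution paths of the full CMA-ES are omitted, and the learning rate and weights are illustrative, not the tuned defaults.

```python
import numpy as np

def simplified_cma_step(f, m, sigma, C, lam=10, mu=5, rng=None):
    """One iteration of a simplified CMA-ES-like update (sketch).

    Samples lam candidates from N(m, sigma^2 C), ranks them by f-value
    only (no gradients), and updates mean and covariance from the best
    mu samples. Step-size adaptation and evolution paths are omitted.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(m)
    A = np.linalg.cholesky(C)                  # C = A A^T
    y = rng.standard_normal((lam, n)) @ A.T    # y_k ~ N(0, C)
    x = m + sigma * y                          # candidate solutions
    order = np.argsort([f(xi) for xi in x])    # only the ranking is used
    w = np.full(mu, 1.0 / mu)                  # equal recombination weights
    y_sel = y[order[:mu]]                      # the mu best steps
    m_new = m + sigma * (w @ y_sel)            # move mean toward best samples
    c_mu = 0.2                                 # learning rate (illustrative)
    C_new = (1 - c_mu) * C + c_mu * (y_sel.T * w) @ y_sel  # rank-mu update
    return m_new, C_new

# Smoke test on a convex quadratic: the mean moves toward the optimum
f = lambda x: float(x @ x)
m, C = np.ones(4), np.eye(4)
rng = np.random.default_rng(0)
for _ in range(60):
    m, C = simplified_cma_step(f, m, 0.3, C, rng=rng)
```

Note that `f` enters only through `argsort`, i.e. only through the ranking of the candidate solutions, which is what makes the method applicable when gradients are unavailable or meaningless.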

The CMA-ES has several invariance properties. Two of them are (i) invariance against order-preserving (i.e. strictly monotonic) transformations of the objective function value, and (ii) invariance against angle-preserving transformations of the search space (including rotation, reflection, and translation), if the initial search point is transformed accordingly. Invariances are highly desirable, because they imply uniform behavior on classes of functions and therefore generalizability of empirical results.
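The first invariance follows directly from the fact that the algorithm uses only the ranking of objective values, never the values themselves. A small Python/NumPy demonstration (illustrative, with an arbitrary strictly increasing transform):

```python
import numpy as np

# CMA-ES selection depends only on the ranking of f-values, so any
# strictly increasing transformation g leaves every decision unchanged.
rng = np.random.default_rng(1)
X = rng.standard_normal((8, 3))          # 8 candidate solutions in R^3
f = lambda x: float(x @ x)               # original objective
g = lambda v: np.exp(3 * v) - 5          # strictly increasing transform

ranking_f = np.argsort([f(x) for x in X])
ranking_gf = np.argsort([g(f(x)) for x in X])
assert (ranking_f == ranking_gf).all()   # identical ranking, identical behavior
```

The same experiment run on `f` and on `g(f(.))` therefore produces identical sequences of search points, which is why results obtained on, say, the sphere function carry over to any monotonic transformation of it.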

Two MATLAB implementations of the CMA-ES are provided, both supporting large population sizes via the so-called rank-µ update (Hansen et al., 2003).
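The rank-µ update adds to the covariance matrix a positive semidefinite term of rank up to µ, built from the µ best steps, which is what allows large populations to contribute useful information in a single iteration. A Python/NumPy sketch of the update C ← (1 − c_µ) C + c_µ Σᵢ wᵢ yᵢ yᵢᵀ (weights and learning rate below are illustrative, not the tuned defaults):

```python
import numpy as np

# Rank-mu update sketch: combine the mu best steps y_i with weights w_i
# into a rank-(up to mu) positive semidefinite update of C.
n, mu = 5, 3
rng = np.random.default_rng(2)
Y = rng.standard_normal((mu, n))                     # mu selected steps
w = np.arange(mu, 0, -1, dtype=float)                # decreasing weights
w /= w.sum()                                         # normalized to sum 1
c_mu = 0.3                                           # learning rate (illustrative)
update = (Y.T * w) @ Y                               # sum_i w_i y_i y_i^T
C = np.eye(n)
C_new = (1 - c_mu) * C + c_mu * update
```

The update term has rank at most µ, and since it is positive semidefinite the new covariance matrix remains positive definite, so sampling from it stays well defined.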


- purecmaes.m: A short and simple one-file implementation, intended for reading the code to understand the algorithm. It comprises roughly 80 lines of code, of which about 20 are of algorithmic interest. This implementation produces only rudimentary output. For running experiments, use:
- cmaes.m: A complete one-file implementation (revised in 2007). This version provides an interface for setting options and can handle coordinate-wise bound constraints.

**Links**

- References to publications on the algorithm
- References to applications using CMA-ES (pdf)