Theoretical foundation for CMA-ES from information geometric perspective

06/04/2012
by Youhei Akimoto, et al.

This paper explores the theoretical basis of the covariance matrix adaptation evolution strategy (CMA-ES) from an information-geometric viewpoint. To establish a theoretical foundation for the CMA-ES, we focus on the geometric structure of a Riemannian manifold of probability distributions equipped with the Fisher metric. We define a function on this manifold, the expectation of fitness under the sampling distribution, and regard the goal of updating the parameters of the sampling distribution in the CMA-ES as maximizing this expected fitness. We investigate steepest-ascent learning for expected fitness maximization, where the steepest-ascent direction is given by the natural gradient: the product of the inverse of the Fisher information matrix and the conventional gradient of the function. Our first result is that, under certain parameterizations of the multivariate normal distribution, the natural gradient of the expected fitness can be obtained without inverting the Fisher information matrix. We find that the update of the distribution parameters in the CMA-ES coincides with natural gradient learning for expected fitness maximization. Our second result is a derivation of the range of learning rates for which a step in the direction of the exact natural gradient improves the parameters in terms of expected fitness. From the close relation between the CMA-ES and natural gradient learning, we see that the default learning rates in the CMA-ES appear suitable with respect to monotone improvement in expected fitness. Finally, we discuss the relation to the expectation-maximization framework and provide an information-geometric interpretation of the CMA-ES.
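As a rough illustration of the abstract's first result (a sketch, not the paper's algorithm): for a Gaussian sampling distribution in the (mean, full-covariance) parameterization, the natural gradient of the expected fitness has a closed form that can be estimated by Monte Carlo, with no explicit Fisher-matrix inversion. The code below assumes simple fitness-minus-baseline weights; the actual CMA-ES additionally uses rank-based weights, separate step-size control, and rank-one covariance updates.

```python
import numpy as np

def natural_gradient_step(f, m, C, n_samples=200, lr_m=0.1, lr_C=0.02, rng=None):
    """One natural-gradient ascent step on the expected fitness
    J(m, C) = E[f(x)] with x ~ N(m, C).

    In the (mean, covariance) parameterization the natural gradient
    (inverse Fisher information matrix times the vanilla gradient)
    reduces to the closed form
        grad_m = E[f(x) (x - m)]
        grad_C = E[f(x) ((x - m)(x - m)^T - C)],
    so no Fisher-matrix inversion is needed.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    xs = rng.multivariate_normal(m, C, size=n_samples)
    fs = np.array([f(x) for x in xs])
    w = (fs - fs.mean()) / n_samples    # baseline-subtracted Monte Carlo weights
    d = xs - m
    grad_m = w @ d                      # estimates E[f(x) (x - m)]
    grad_C = (d * w[:, None]).T @ d     # estimates grad_C; the -C term cancels
                                        # exactly because the weights sum to zero
    return m + lr_m * grad_m, C + lr_C * grad_C

# Usage: maximize f(x) = -||x||^2; the mean should contract toward the origin
# and the covariance should stay positive definite under the small learning rates.
rng = np.random.default_rng(42)
m, C = np.array([3.0, -2.0]), np.eye(2)
for _ in range(40):
    m, C = natural_gradient_step(lambda x: -x @ x, m, C, rng=rng)
```

Note that the baseline subtraction plays the role of a variance-reduction term and makes the update invariant to adding a constant to the fitness, mirroring the invariance properties emphasized in the natural-gradient view of the CMA-ES.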


