Objective Improvement in Information-Geometric Optimization

11/16/2012
by Youhei Akimoto, et al.

Information-Geometric Optimization (IGO) is a unified framework of stochastic algorithms for optimization problems. Given a family of probability distributions, IGO turns the original optimization problem into a new maximization problem on the parameter space of the probability distributions. IGO updates the parameter of the probability distribution along the natural gradient, taken with respect to the Fisher metric on the parameter manifold, aiming at maximizing an adaptive transform of the objective function. IGO recovers several known algorithms as particular instances: PBIL for the family of Bernoulli distributions, the pure rank-mu CMA-ES update for the family of Gaussian distributions, and the cross-entropy/maximum-likelihood method for exponential families in expectation parametrization. This article provides a theoretical justification for the IGO framework by proving that any step size not greater than 1 guarantees monotone improvement over the course of optimization, in terms of the q-quantile values of the objective function f. This range of admissible step sizes is independent of f and of its domain. We extend the result to the case where different step sizes are used for different blocks of the parameters of the IGO algorithm. Moreover, we prove that expected fitness improves over time when fitness-proportional selection is applied, in which case the RPP algorithm is recovered.
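
To make the update rule concrete, below is a minimal sketch of the Bernoulli instance of IGO, i.e. the PBIL case mentioned in the abstract. It is an illustrative toy under stated assumptions, not the paper's pseudocode: the function name igo_bernoulli, the population size lam, the selection fraction q, and all default values are choices made for this example. It uses the standard fact that, for the Bernoulli family in expectation parametrization, the natural gradient of ln P_theta(x) reduces to (x - p), which keeps the sketch short.

```python
import numpy as np

def igo_bernoulli(f, dim, steps=200, lam=20, dt=0.5, q=0.5, seed=0):
    """Minimal IGO sketch on the Bernoulli family (the PBIL instance).

    Minimizes f over {0, 1}^dim; names and defaults are illustrative
    assumptions, not notation from the paper.
    """
    rng = np.random.default_rng(seed)
    p = np.full(dim, 0.5)                        # distribution parameter theta
    mu = int(q * lam)                            # number of selected samples
    w = np.concatenate([np.full(mu, 1.0 / mu),   # quantile-based weights:
                        np.zeros(lam - mu)])     # best q-fraction shares weight 1
    for _ in range(steps):
        x = (rng.random((lam, dim)) < p).astype(float)  # sample from P_theta
        order = np.argsort([f(xi) for xi in x])         # rank samples, best first
        # Natural-gradient step: in expectation parametrization the natural
        # gradient of ln P_theta(x) for the Bernoulli family is simply (x - p).
        p = p + dt * np.sum(w[:, None] * (x[order] - p), axis=0)
        p = np.clip(p, 1e-3, 1.0 - 1e-3)         # stay inside the open manifold
    return p

# Toy usage: maximize the number of one bits, phrased as minimization.
p_final = igo_bernoulli(lambda xi: xi.size - xi.sum(), dim=10)
```

Because the quantile-based weights here are nonnegative and sum to 1, each step is a convex combination of the current parameter and the empirical mean of the selected samples, so any step size dt <= 1 keeps p inside [0, 1]^dim; this matches the range of step sizes for which the abstract guarantees monotone q-quantile improvement.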

Related research

- 06/04/2012: Theoretical foundation for CMA-ES from information geometric perspective
  This paper explores the theoretical basis of the covariance matrix adapt...

- 06/21/2012: Convergence of the Continuous Time Trajectories of Isotropic Evolution Strategies on Monotonic C^2-composite Functions
  The Information-Geometric Optimization (IGO) has been introduced as a un...

- 02/24/2019: A Formalization of The Natural Gradient Method for General Similarity Measures
  In optimization, the natural gradient method is well-known for likelihoo...

- 02/24/2022: Entropic trust region for densest crystallographic symmetry group packings
  Molecular crystal structure prediction (CSP) seeks the most stable perio...

- 04/06/2022: Monotone Improvement of Information-Geometric Optimization Algorithms with a Surrogate Function
  A surrogate function is often employed to reduce the number of objective...

- 02/14/2011: Chernoff information of exponential families
  Chernoff information upper bounds the probability of error of the optima...

- 07/12/2021: The Fundamental Theorem of Natural Selection
  Suppose we have n different types of self-replicating entity, with the p...
