Convergence Analysis of the Hessian Estimation Evolution Strategy

09/06/2020
by Tobias Glasmachers, et al.

Hessian Estimation Evolution Strategies (HE-ESs) are a class of algorithms that update the covariance matrix of their sampling distribution by directly estimating the curvature of the objective function. The approach is practically efficient, as attested by respectable performance on the BBOB testbed, even on rather irregular functions. In this paper we formally prove two strong guarantees for the (1+4)-HE-ES, a minimal elitist member of the family: stability of the covariance matrix update and, as a consequence, linear convergence on all convex quadratic problems at a rate that is independent of the problem instance.
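To make the mechanism concrete, the sketch below illustrates the two ingredients the abstract names: a finite-difference curvature estimate obtained from mirrored sample pairs, and a multiplicative rescaling of the covariance factor toward the inverse curvature, combined with elitist selection. This is a minimal illustration under stated assumptions, not the paper's algorithm: the function name he_es_step, the damping exponent 0.25, and the fixed step size are choices made for this example; the actual (1+4)-HE-ES uses a normalized covariance update and step-size adaptation, whose stability is precisely what the paper proves.

```python
import numpy as np

def he_es_step(f, x, A, sigma, rng):
    """One illustrative elitist step in the spirit of the (1+4)-HE-ES.

    Sketch only (not the paper's exact update). The sampling covariance
    is C = A @ A.T. Offspring come in mirrored pairs x +/- sigma*d; each
    pair yields a finite-difference curvature estimate along d, which is
    used to nudge the sampling variance along d toward the inverse
    curvature.
    """
    n = len(x)
    f0 = f(x)
    best_x, best_f = x, f0
    for _ in range(2):                       # two mirrored pairs = 4 offspring
        d = A @ rng.standard_normal(n)       # direction drawn from N(0, C)
        xp, xm = x + sigma * d, x - sigma * d
        fp, fm = f(xp), f(xm)
        # Symmetric finite difference: on a quadratic with Hessian H this
        # equals d^T H d / ||d||^2, the curvature along d, exactly.
        h = (fp - 2.0 * f0 + fm) / (sigma**2 * (d @ d))
        if h > 0:                            # ignore non-convex estimates
            u = d / np.linalg.norm(d)
            std_along_u = np.linalg.norm(A.T @ u)
            # Damped multiplicative factor moving the standard deviation
            # along u toward 1/sqrt(h); the exponent 0.25 is an arbitrary
            # damping choice for this sketch.
            gamma = (1.0 / (np.sqrt(h) * std_along_u)) ** 0.25
            A = A + (gamma - 1.0) * np.outer(u, u @ A)  # A <- (I+(g-1)uu^T)A
        for xi, fi in ((xp, fp), (xm, fm)):  # elitist selection: keep the
            if fi < best_f:                  # parent unless a child improves
                best_x, best_f = xi, fi
    return best_x, A

# Usage: a badly conditioned convex quadratic in 2-D. Step-size adaptation
# is omitted in this sketch, so sigma is simply held fixed.
H = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ H @ x
rng = np.random.default_rng(0)
x, A = np.array([3.0, 2.0]), np.eye(2)
for _ in range(300):
    x, A = he_es_step(f, x, A, sigma=0.3, rng=rng)
```

Elitist selection guarantees that the best-so-far objective value never increases; the paper's contribution is to show that the actual normalized covariance update also remains stable, which yields linear convergence at an instance-independent rate on convex quadratics.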

Related research

03/30/2020 · The Hessian Estimation Evolution Strategy
We present a novel black box optimization algorithm called Hessian Estim...

08/01/2023 · Mirror Natural Evolution Strategies
The zeroth-order optimization has been widely used in machine learning a...

04/18/2012 · Analysis of a Natural Gradient Algorithm on Monotonic Convex-Quadratic-Composite Functions
In this paper we investigate the convergence properties of a variant of ...

06/10/2018 · On the Covariance-Hessian Relation in Evolution Strategies
We consider Evolution Strategies operating only with isotropic Gaussian ...

03/18/2021 · Hessian Initialization Strategies for L-BFGS Solving Non-linear Inverse Problems
L-BFGS is the state-of-the-art optimization method for many large scale ...

10/11/2017 · A Simple Yet Efficient Rank One Update for Covariance Matrix Adaptation
In this paper, we propose an efficient approximated rank one update for ...

05/20/2021 · EiGLasso for Scalable Sparse Kronecker-Sum Inverse Covariance Estimation
In many real-world problems, complex dependencies are present both among...
