A Principle for Global Optimization with Gradients

08/18/2023
by Nils Müller, et al.

This work demonstrates the utility of gradients for the global optimization of certain differentiable functions with many suboptimal local minima. To this end, a principle for generating search directions from non-local quadratic approximants based on gradients of the objective function is analyzed. Experiments measure the quality of non-local search directions as well as the performance of a simple algorithm built on this principle, of the covariance matrix adaptation evolution strategy (CMA-ES), and of a randomly reinitialized Broyden-Fletcher-Goldfarb-Shanno (BFGS) method.
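The core idea described in the abstract, deriving a search direction from a quadratic approximant fitted to gradients sampled over a wide region rather than at a single point, can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's actual algorithm: the test function, the Gaussian sampling cloud, the least-squares fit, and all parameter values are assumptions chosen for demonstration.

```python
import numpy as np

def grad_f(x):
    """Gradient of a Rastrigin-style test function
    f(x) = sum(x_i**2 + 1 - cos(2*pi*x_i)),
    which has many suboptimal local minima and a global minimum at 0."""
    return 2.0 * x + 2.0 * np.pi * np.sin(2.0 * np.pi * x)

def nonlocal_quadratic_direction(grad, x, radius, n_samples, rng):
    """Fit an affine gradient model g(y) ~= A y + b to gradients sampled
    in a wide Gaussian cloud around x (the non-local part), then step
    toward the minimizer -A^{-1} b of the implied quadratic approximant."""
    d = x.size
    Y = x + radius * rng.standard_normal((n_samples, d))
    G = np.array([grad(y) for y in Y])
    Phi = np.hstack([Y, np.ones((n_samples, 1))])   # design matrix [y, 1]
    coef, *_ = np.linalg.lstsq(Phi, G, rcond=None)  # (d+1, d) coefficients
    A, b = coef[:d].T, coef[d]
    x_star = np.linalg.solve(A, -b)                 # quadratic-model minimizer
    return x_star - x

rng = np.random.default_rng(0)
x = np.array([3.2, -2.6])   # start in a basin far from the global optimum
for _ in range(12):
    x = x + nonlocal_quadratic_direction(grad_f, x, radius=2.0,
                                         n_samples=500, rng=rng)
print(np.round(x, 2))       # iterates settle near the global minimum at 0
```

With a sampling radius much larger than the oscillation period, the short-wavelength terms of the gradient average out of the fit, so the recovered quadratic model reflects the broad bowl of the objective rather than the nearest local basin; a purely local method started at the same point would stall in a suboptimal minimum.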


Related research

- 12/01/1997 · When Gravity Fails: Local Search Topology
  Local search algorithms for combinatorial search problems frequently enc...

- 02/07/2020 · A Scalable Evolution Strategy with Directional Gaussian Smoothing for Blackbox Optimization
  We developed a new scalable evolution strategy with directional Gaussian...

- 05/22/2022 · Covariance Matrix Adaptation MAP-Annealing
  Single-objective optimization algorithms search for the single highest-q...

- 03/30/2020 · The Hessian Estimation Evolution Strategy
  We present a novel black box optimization algorithm called Hessian Estim...

- 04/28/2021 · Generalised Pattern Search Based on Covariance Matrix Diagonalisation
  Pattern Search is a family of gradient-free direct search methods for nu...

- 03/16/2019 · On-line Search History-assisted Restart Strategy for Covariance Matrix Adaptation Evolution Strategy
  Restart strategy helps the covariance matrix adaptation evolution strate...

- 07/08/2020 · Non-local modeling with asymptotic expansion homogenization of random materials
  The aim of this study is to build a non-local homogenized model for thre...
