Optimising Optimisers with Push GP

10/02/2019
by Michael Lones et al.

This work uses Push GP to automatically design both local and population-based optimisers for continuous-valued problems. The optimisers are trained on a single function optimisation landscape, using random transformations to discourage overfitting. They are then tested for generality on larger versions of the same problem, and on other continuous-valued problems. In most cases, the optimisers generalise well to the larger problems. Surprisingly, some of them also generalise very well to previously unseen problems, outperforming existing general-purpose optimisers such as CMA-ES. Analysis of the behaviour of the evolved optimisers indicates a range of interesting optimisation strategies that are not found within conventional optimisers, suggesting that this approach could be useful for discovering novel and effective forms of optimisation in an automated manner.
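
The training setup described in the abstract, evolving optimisers against a single landscape whose coordinates are randomly transformed between evaluations, can be illustrated roughly as follows. This is a minimal sketch and not the paper's implementation: it assumes a sphere base function, random shift-plus-rotation transformations, and a plain random-search stand-in for the evolved Push-program optimiser; the names random_transform, score_optimiser and random_search are illustrative only.

```python
import numpy as np

def sphere(x):
    """Base training landscape: the sphere function, minimised at the origin."""
    return float(np.sum(np.asarray(x) ** 2))

def random_transform(rng, dim, shift_range=2.0):
    """Return a randomly shifted and rotated view of the base landscape.

    Each call produces a different transformation, so an evolved optimiser
    cannot simply memorise where the optimum lies.
    """
    shift = rng.uniform(-shift_range, shift_range, size=dim)
    rotation, _ = np.linalg.qr(rng.normal(size=(dim, dim)))  # random orthogonal matrix
    def transformed(x):
        return sphere(rotation @ (np.asarray(x) - shift))
    return transformed

def score_optimiser(optimiser, dim=5, n_landscapes=10, budget=500, seed=0):
    """Fitness of a candidate optimiser: the best objective value it reaches,
    averaged over several randomly transformed copies of the training landscape."""
    rng = np.random.default_rng(seed)
    results = []
    for _ in range(n_landscapes):
        f = random_transform(rng, dim)
        results.append(optimiser(f, dim, budget, rng))
    return float(np.mean(results))

def random_search(f, dim, budget, rng):
    """Stand-in with the expected interface; in the paper this role is played
    by an evolved Push program rather than plain random search."""
    return min(f(rng.uniform(-5.0, 5.0, size=dim)) for _ in range(budget))

if __name__ == "__main__":
    print(score_optimiser(random_search))
```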

Related research

03/22/2021  Evolving Continuous Optimisers from Scratch
This work uses genetic programming to explore the space of continuous op...

05/24/2019  Instruction-Level Design of Local Optimisers using Push GP
This work uses genetic programming to explore the design space of local ...

07/28/2021  Automated Design of Heuristics for the Container Relocation Problem
The container relocation problem is a challenging combinatorial optimisa...

06/21/2019  Sparse Spectrum Gaussian Process for Bayesian Optimisation
We propose a novel sparse spectrum approximation of Gaussian process (GP...

02/05/2021  GIBBON: General-purpose Information-Based Bayesian OptimisatioN
This paper describes a general-purpose extension of max-value entropy se...

03/20/2016  Multi-fidelity Gaussian Process Bandit Optimisation
In many scientific and engineering applications, we are tasked with the ...
