Lazy Parameter Tuning and Control: Choosing All Parameters Randomly From a Power-Law Distribution

04/14/2021
by   Denis Antipov, et al.

Most evolutionary algorithms have multiple parameters, and their values drastically affect performance. Due to the often complicated interplay of the parameters, setting them right for a particular problem (parameter tuning) is a challenging task. The task becomes even harder when the optimal parameter values change significantly during the run of the algorithm, since then a dynamic parameter choice (parameter control) is necessary. In this work, we propose a lazy but effective solution: choosing all parameter values (where this makes sense) in each iteration randomly from a suitably scaled power-law distribution. To demonstrate the effectiveness of this approach, we perform runtime analyses of the (1+(λ,λ)) genetic algorithm with all three parameters chosen in this manner. On the one hand, we show that this algorithm can imitate simple hill-climbers like the (1+1) EA, giving the same asymptotic runtime on problems like OneMax, LeadingOnes, or Minimum Spanning Tree. On the other hand, it is also very efficient on jump functions, where the best static parameters are very different from those necessary to optimize simple problems. We prove a performance guarantee comparable to, and sometimes better than, the best known performance for static parameters. We complement our theoretical results with a rigorous empirical study confirming what the asymptotic runtime results suggest.
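The core idea of sampling a parameter from a scaled power-law distribution can be illustrated with a short sketch. This is not the paper's exact scheme; the truncation point, the exponent β = 2.5, and the couplings p = λ/n and c = 1/λ (the standard choices for the (1+(λ,λ)) GA) are assumptions for illustration:

```python
import random

def power_law_sample(upper, beta=2.5):
    """Sample i in {1, ..., upper} with P(i) proportional to i^(-beta).

    Small values are most likely, but large values still occur with
    polynomially (not exponentially) small probability, which is what
    lets the algorithm occasionally take the large jumps needed on
    functions like Jump.
    """
    weights = [i ** -beta for i in range(1, upper + 1)]
    return random.choices(range(1, upper + 1), weights=weights)[0]

# Hypothetical per-iteration parameter choice for a (1+(lambda,lambda)) GA:
n = 100                        # problem size (illustrative)
lam = power_law_sample(n // 2) # population size, freshly drawn each iteration
p = lam / n                    # mutation rate coupled to lambda (assumed coupling)
c = 1 / lam                    # crossover bias coupled to lambda (assumed coupling)
```

Because the distribution is heavy-tailed, most iterations behave like a cheap hill-climber (small λ), while rare large-λ iterations supply the diversity needed on multimodal landscapes, all without any tuning.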


Related research

- 06/05/2020: Runtime Analysis of a Heavy-Tailed (1+(λ,λ)) Genetic Algorithm on Jump Functions
- 02/09/2021: Optimal Static Mutation Strength Distributions for the (1+λ) Evolutionary Algorithm on OneMax
- 04/27/2020: MATE: A Model-based Algorithm Tuning Engine
- 06/19/2020: Hybridizing the 1/5-th Success Rule with Q-Learning for Controlling the Mutation Rate of an Evolutionary Algorithm
- 02/07/2022: Theory-inspired Parameter Control Benchmarks for Dynamic Algorithm Configuration
- 04/21/2023: Tree-structured Parzen estimator: Understanding its algorithm components and their roles for better empirical performance
- 02/23/2023: Using Automated Algorithm Configuration for Parameter Control
