Optimal Parameter Choices Through Self-Adjustment: Applying the 1/5-th Rule in Discrete Settings

04/13/2015
by Benjamin Doerr, et al.

While evolutionary algorithms are known to be very successful for a broad range of applications, the algorithm designer is often left with many algorithmic choices, for example, the size of the population, the mutation rates, and the crossover rates of the algorithm. These parameters are known to have a crucial influence on the optimization time and thus need to be chosen carefully, a task that often requires substantial effort. Moreover, the optimal parameters can change during the optimization process. It is therefore of great interest to design mechanisms that dynamically choose best-possible parameters. An example of such an update mechanism is the one-fifth success rule for step-size adaptation in evolution strategies. While in continuous domains this principle is well understood, also from a mathematical point of view, no comparable theory is available for problems in discrete domains. In this work we show that the one-fifth success rule can be effective also in discrete settings. We consider the (1+(λ,λ)) GA proposed in [Doerr/Doerr/Ebel: From black-box complexity to designing new genetic algorithms, TCS 2015]. We prove that if its population size is chosen according to the one-fifth success rule, then the expected optimization time on OneMax is linear. This is better than what any static population size λ can achieve, and it is asymptotically optimal among all adaptive parameter choices as well.
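
To make the self-adjusting mechanism concrete, here is a minimal Python sketch of the (1+(λ,λ)) GA on OneMax with the population size λ controlled by the one-fifth success rule, following the description above. The update factor F = 1.5, the cap of λ at n, and the rounding of λ to an integer when it is used are illustrative assumptions, not details taken from the paper.

```python
import random


def onemax(x):
    """Number of one-bits; the fitness function to be maximized."""
    return sum(x)


def one_plus_lambda_lambda_ga(n, update_strength=1.5, seed=None):
    """Sketch of the (1+(lambda,lambda)) GA on OneMax with the population
    size lambda self-adjusted by the one-fifth success rule.

    The factor `update_strength` (F) and the capping/rounding of lambda
    are illustrative choices. Returns the number of fitness evaluations
    spent until the optimum is found."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = onemax(x)
    lam = 1.0                       # real-valued lambda, rounded when used
    evaluations = 0

    while fx < n:
        k = max(1, round(lam))      # offspring population size
        p = k / n                   # mutation rate lambda / n
        c = 1.0 / k                 # crossover bias 1 / lambda

        # Mutation phase: draw ell ~ Bin(n, p) once, then create k mutants
        # that each flip exactly ell randomly chosen bits of x.
        ell = sum(rng.random() < p for _ in range(n))
        best_mutant, best_mutant_fit = None, -1
        for _ in range(k):
            mutant = x[:]
            for pos in rng.sample(range(n), ell):
                mutant[pos] ^= 1
            evaluations += 1
            f = onemax(mutant)
            if f > best_mutant_fit:
                best_mutant, best_mutant_fit = mutant, f

        # Crossover phase: k biased uniform crossovers of x and the best
        # mutant, taking each bit from the mutant with probability c.
        best_child, best_child_fit = x, fx
        for _ in range(k):
            child = [mi if rng.random() < c else xi
                     for xi, mi in zip(x, best_mutant)]
            evaluations += 1
            f = onemax(child)
            if f > best_child_fit:
                best_child, best_child_fit = child, f

        # Elitist selection plus the one-fifth success rule: shrink lambda
        # after a strict improvement, grow it (more slowly) otherwise.
        if best_child_fit > fx:
            x, fx = best_child, best_child_fit
            lam = max(1.0, lam / update_strength)
        else:
            lam = min(float(n), lam * update_strength ** 0.25)

    return evaluations


if __name__ == "__main__":
    print("evaluations on OneMax, n=100:",
          one_plus_lambda_lambda_ga(100, seed=42))
```

After a success λ shrinks by the factor F, after a failure it grows by F^(1/4), so the two effects balance exactly when roughly one in five iterations is successful; this is what keeps λ close to its currently best value during the run.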

