Hard Problems are Easier for Success-based Parameter Control

Recent works have shown that simple success-based rules for self-adjusting parameters in evolutionary algorithms (EAs) can match or outperform the best fixed parameters on discrete problems. In particular, non-elitism in a (1,λ) EA combined with a self-adjusting offspring population size λ outperforms common EAs on the multimodal Cliff problem. However, it was shown that this only holds if the success rate s that governs self-adjustment is small enough; otherwise, even on OneMax, the self-adjusting (1,λ) EA stagnates on an easy slope, where frequent successes drive down the offspring population size. We show that self-adjustment works as intended in the absence of easy slopes. We define everywhere hard functions, for which successes are never easy to find, and show that the self-adjusting (1,λ) EA is robust with respect to the choice of the success rate s. We give a general fitness-level upper bound on the number of evaluations and show that the expected number of generations is at most O(d + log(1/p_min)), where d is the number of non-optimal fitness values and p_min is the smallest probability of finding an improvement from a non-optimal search point. Finally, we discuss implications for the everywhere hard function LeadingOnes and for a new class, OneMaxBlocks, of everywhere hard functions with tunable difficulty.
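To make the mechanism concrete, the following is a minimal sketch of a self-adjusting (1,λ) EA using a success-based rule of the kind the abstract describes: after a successful generation (the best offspring strictly improves on the parent) λ is divided by an update factor F, and after a failure λ is multiplied by F^(1/s). The function name, the choice F = 1.5, the cap of λ at n, and the use of OneMax as the target are illustrative assumptions, not details taken from the paper.

```python
import random

def self_adjusting_one_comma_lambda_ea(f, n, s=0.1, F=1.5, max_gens=100000):
    """Sketch of a non-elitist (1,lambda) EA with a success-based rule:
    on success lambda /= F, on failure lambda *= F**(1/s).
    Assumes f is maximized with optimum value n (e.g., OneMax)."""
    x = [random.randint(0, 1) for _ in range(n)]
    lam = 1.0  # real-valued lambda; rounded when producing offspring
    evals = 0
    for _ in range(max_gens):
        if f(x) == n:  # optimum reached
            break
        # create round(lam) offspring by standard bit mutation, rate 1/n
        offspring = [
            [1 - b if random.random() < 1.0 / n else b for b in x]
            for _ in range(max(1, round(lam)))
        ]
        evals += len(offspring)
        best = max(offspring, key=f)
        if f(best) > f(x):                      # success: shrink lambda
            lam = max(1.0, lam / F)
        else:                                   # failure: grow lambda
            lam = min(float(n), lam * F ** (1.0 / s))
        x = best  # comma selection: the parent is always replaced
    return x, evals

onemax = lambda bits: sum(bits)
x, evals = self_adjusting_one_comma_lambda_ea(onemax, 20, s=0.1)
```

With a small success rate such as s = 0.1, one failed generation multiplies λ by F^10, so the population size recovers quickly after unsuccessful steps; this is the regime in which the cited analyses show the self-adjusting (1,λ) EA behaves robustly.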


Related research:

- Self-Adjusting Population Sizes for Non-Elitist Evolutionary Algorithms: Why Success Rates Matter (04/12/2021)
- OneMax is not the Easiest Function for Fitness Improvements (04/14/2022)
- Self-Adjusting Evolutionary Algorithms for Multimodal Optimization (04/07/2020)
- Offspring Population Size Matters when Comparing Evolutionary Algorithms with Self-Adjusting Mutation Rates (04/17/2019)
- Self-adjusting Population Sizes for the (1, λ)-EA on Monotone Functions (04/01/2022)
- The 1/5-th Rule with Rollbacks: On Self-Adjustment of the Population Size in the (1+(λ,λ)) GA (04/15/2019)
- A second-order self-adjusting steepness based remapping method for arbitrary quadrilateral meshes (12/26/2020)
