
Noisy Optimization: Convergence with a Fixed Number of Resamplings

by Marie-Liesse Cauwet, et al.

It is known that evolution strategies in continuous domains might not converge in the presence of noise. It is also known that, under mild assumptions, one can mitigate the effect of additive noise and recover convergence by using an increasing number of resamplings. We show new sufficient conditions for the convergence of an evolutionary algorithm with a constant number of resamplings; in particular, we obtain fast rates (log-linear convergence) provided that the variance of the noise decreases around the optimum slightly faster than in the so-called multiplicative noise model.

Keywords: noisy optimization, evolutionary algorithms, theory.



