
Noisy Optimization: Convergence with a Fixed Number of Resamplings

04/09/2014
by Marie-Liesse Cauwet, et al.

It is known that evolution strategies in continuous domains might not converge in the presence of noise. It is also known that, under mild assumptions, and using an increasing number of resamplings, one can mitigate the effect of additive noise and recover convergence. We show new sufficient conditions for the convergence of an evolutionary algorithm with a constant number of resamplings; in particular, we obtain fast rates (log-linear convergence) provided that the noise variance decreases around the optimum slightly faster than in the so-called multiplicative noise model.

Keywords: noisy optimization, evolutionary algorithm, theory.
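To make the setting concrete, below is a minimal sketch (in Python, not taken from the paper) of a (1+1)-evolution strategy that averages a fixed number of resamplings per fitness evaluation. The noisy sphere objective, the exponent z, and the step-size rule are illustrative assumptions: with z > 1 the noise variance decays around the optimum slightly faster than in the multiplicative noise model (z = 1), which is the regime in which the paper obtains log-linear convergence.

```python
import numpy as np

# Illustrative sketch only; f_noisy, z, and the 1/5th-success-style step-size
# update are assumptions, not the paper's exact algorithm.

def f_noisy(x, rng, z=1.2):
    """Noisy sphere: optimum at 0, noise std ~ ||x||^(2z), variance ~ ||x||^(4z).
    z = 1 corresponds to the multiplicative noise model; z > 1 makes the
    variance shrink slightly faster near the optimum."""
    dist2 = float(np.sum(x ** 2))
    return dist2 + (dist2 ** z) * rng.normal()

def one_plus_one_es(dim=5, resamplings=10, iterations=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.ones(dim)        # current search point
    sigma = 0.5             # step size, adapted with a 1/5th-success-style rule

    def fitness(point):
        # Fixed (constant) number of resamplings, averaged to reduce the noise.
        return np.mean([f_noisy(point, rng) for _ in range(resamplings)])

    fx = fitness(x)
    for _ in range(iterations):
        y = x + sigma * rng.normal(size=dim)    # Gaussian mutation
        fy = fitness(y)
        if fy <= fx:                            # accept improving offspring
            x, fx = y, fy
            sigma *= 1.5
        else:
            sigma *= 1.5 ** (-0.25)
    return x

if __name__ == "__main__":
    x_final = one_plus_one_es()
    print("distance to optimum:", np.linalg.norm(x_final))
```

Informally, because the noise variance shrinks fast enough near the optimum, a constant number of resamplings keeps the acceptance decisions reliable, which is the intuition behind the constant-resampling result stated in the abstract.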


Related research

07/04/2012

Sufficient conditions for convergence of Loopy Belief Propagation

We derive novel sufficient conditions for convergence of Loopy Belief Pr...
04/30/2015

Average Convergence Rate of Evolutionary Algorithms

In evolutionary optimization, it is important to understand how fast evo...
03/01/2023

Pathwise Uniform Convergence of Time Discretisation Schemes for SPDEs

In this paper we prove convergence rates for time discretisation schemes...
08/20/2011

Convergence of a Recombination-Based Elitist Evolutionary Algorithm on the Royal Roads Test Function

We present an analysis of the performance of an elitist Evolutionary alg...
11/20/2013

Analyzing Evolutionary Optimization in Noisy Environments

Many optimization tasks have to be handled in noisy environments, where ...
06/18/2019

Nonparametric estimation in a regression model with additive and multiplicative noise

In this paper, we consider an unknown functional estimation problem in a...
07/08/2018

A New Noise-Assistant LMS Algorithm for Preventing the Stalling Effect

In this paper, we introduce a new algorithm to deal with the stalling ef...