Noisy Optimization: Convergence with a Fixed Number of Resamplings

04/09/2014
by Marie-Liesse Cauwet, et al.

It is known that evolution strategies in continuous domains might not converge in the presence of noise. It is also known that, under mild assumptions, using an increasing number of resamplings mitigates the effect of additive noise and recovers convergence. We show new sufficient conditions for the convergence of an evolutionary algorithm with a constant number of resamplings; in particular, we obtain fast rates (log-linear convergence) provided that the variance of the noise decreases around the optimum slightly faster than in the so-called multiplicative noise model.

Keywords: Noisy optimization, evolutionary algorithm, theory.
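The setting in the abstract can be illustrated with a small sketch: a (1+1) evolution strategy where every fitness evaluation is averaged over a constant number k of resamplings, applied to a hypothetical noisy sphere function whose noise standard deviation shrinks near the optimum slightly faster than multiplicatively (exponent 1.1 on the objective value). The objective, the exponent, and all parameter values below are illustrative assumptions, not the paper's exact model.

```python
import math
import random

def noisy_sphere(x, noise_scale=0.1):
    # Hypothetical objective: sphere function plus Gaussian noise whose
    # standard deviation decays like value**1.1 near the optimum, i.e.
    # slightly faster than in the multiplicative noise model (value**1.0).
    value = sum(xi * xi for xi in x)
    return value + random.gauss(0.0, noise_scale * value ** 1.1)

def resampled_eval(f, x, k):
    # Constant number of resamplings: average k independent noisy
    # evaluations to reduce the noise variance by a factor of k.
    return sum(f(x) for _ in range(k)) / k

def one_plus_one_es(f, x0, k=5, sigma0=1.0, iters=2000):
    # (1+1)-ES with a one-fifth-success-rule step-size adaptation and
    # a fixed number k of resamplings per fitness evaluation.
    x, sigma = list(x0), sigma0
    fx = resampled_eval(f, x, k)
    for _ in range(iters):
        y = [xi + sigma * random.gauss(0.0, 1.0) for xi in x]
        fy = resampled_eval(f, y, k)
        if fy <= fx:
            x, fx = y, fy
            sigma *= math.exp(0.8)   # expand step size on success
        else:
            sigma *= math.exp(-0.2)  # shrink step size on failure
    return x

random.seed(0)
opt = one_plus_one_es(noisy_sphere, [1.0, 1.0], k=5)
print(max(abs(v) for v in opt))
```

Because the noise vanishes quickly enough near the optimum, the averaged comparisons stay reliable at every scale and the constant k suffices; with purely additive noise the same algorithm would eventually stall once the fitness differences fall below the residual noise level.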
