Convergence rate of a simulated annealing algorithm with noisy observations

03/01/2017
by   Clément Bouttier, et al.

In this paper we propose a modified version of the simulated annealing algorithm for solving a stochastic global optimization problem. More precisely, we address the problem of finding a global minimizer of a function with noisy evaluations. We provide a rate of convergence and an optimized parametrization that ensures a minimal number of evaluations for a given accuracy and a confidence level close to 1. This work is complemented by a set of numerical experiments that assess the practical performance on both benchmark test cases and real-world examples.
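To make the setting concrete, the following is a minimal, illustrative sketch of simulated annealing with noisy objective evaluations: each candidate is scored by averaging several noisy samples before the Metropolis acceptance test. This is a common, simple noise-handling strategy for exposition only; the paper's actual algorithm, cooling schedule, and parametrization may differ.

```python
import math
import random

def noisy_sa(f_noisy, x0, n_iters=5000, step=0.5, c=1.0, n_eval=10, seed=0):
    """Simulated annealing on a 1-D objective observed with noise.

    f_noisy(x) returns a noisy estimate of the objective; we average
    n_eval samples per candidate to reduce the noise variance.
    All parameter names and defaults here are illustrative assumptions.
    """
    rng = random.Random(seed)

    def estimate(x):
        # Average several noisy evaluations of the objective at x.
        return sum(f_noisy(x) for _ in range(n_eval)) / n_eval

    x, fx = x0, estimate(x0)
    best_x, best_f = x, fx
    for k in range(1, n_iters + 1):
        T = c / math.log(k + 1)        # logarithmic cooling schedule
        y = x + rng.gauss(0.0, step)   # Gaussian random-walk proposal
        fy = estimate(y)
        # Metropolis acceptance rule applied to the noisy estimates.
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

# Example: minimize (x - 2)^2 observed through additive Gaussian noise.
random.seed(1)
f = lambda x: (x - 2.0) ** 2 + random.gauss(0.0, 0.1)
x_star, f_star = noisy_sa(f, x0=10.0)
```

Averaging `n_eval` samples per point trades extra evaluations for a lower-variance estimate; the paper's contribution is precisely to optimize this kind of trade-off between total evaluation budget, accuracy, and confidence level.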


