A simple parameter-free and adaptive approach to optimization under a minimal local smoothness assumption

10/01/2018
by   Peter L. Bartlett, et al.

We study the problem of optimizing a function under a budgeted number of evaluations. We assume only that the function is locally smooth around one of its global optima. The difficulty of the optimization is measured in terms of 1) the amount of noise b in the function evaluations and 2) the local smoothness, d, of the function; a smaller d yields a smaller optimization error. We present a new, simple, and parameter-free approach. First, for all values of b and d, this approach recovers at least the state-of-the-art regret guarantees. Second, it obtains these results while being agnostic to the values of both b and d. This gives the first algorithm that naturally adapts to an unknown range of noise b, and it brings significant improvements in the moderate- and low-noise regimes. Third, our approach also yields a remarkable improvement over the state-of-the-art algorithm when the noise is very low, which includes the case of optimization under deterministic feedback (b=0). There, under our minimal local smoothness assumption, the improvement is of exponential magnitude and holds for a class of functions that covers the vast majority of functions that practitioners optimize (d=0). We show that this algorithmic improvement is also borne out in numerical experiments, where we empirically demonstrate faster convergence on common benchmark functions.
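The abstract does not spell out the algorithm, but the core idea behind such parameter-free methods is a hierarchical partitioning of the search space that opens fewer cells at deeper levels, so that no smoothness parameter has to be supplied as input. Below is a minimal, hypothetical sketch in that spirit for the deterministic-feedback case (b=0); the harmonic normalization, the floor(h_max/h) opening schedule per depth, and the ternary partition are illustrative assumptions, not the authors' exact pseudocode, and the budget accounting is approximate.

```python
def sequool_sketch(f, n, k=3, lo=0.0, hi=1.0):
    """Hypothetical sketch of a parameter-free hierarchical optimizer
    (deterministic feedback, b = 0). Maximizes f on [lo, hi] with a
    k-ary partition; takes no smoothness parameter d as input."""
    log_bar = sum(1.0 / i for i in range(1, n + 1))  # harmonic number, ~ ln n
    h_max = max(1, int(n / log_bar))                 # deepest depth explored

    mid = (lo + hi) / 2
    best_x, best_v = mid, f(mid)
    depth = [(lo, hi, best_v)]                       # cells at the current depth

    for h in range(1, h_max + 1):
        # Open the floor(h_max / h) best cells of the previous depth:
        # deeper levels get fewer openings, which is what removes the
        # need to know the local smoothness d in advance.
        depth.sort(key=lambda cell: -cell[2])
        children = []
        for c_lo, c_hi, _ in depth[: h_max // h]:
            w = (c_hi - c_lo) / k
            for j in range(k):                       # evaluate the k children
                x0, x1 = c_lo + j * w, c_lo + (j + 1) * w
                x = (x0 + x1) / 2
                v = f(x)
                children.append((x0, x1, v))
                if v > best_v:
                    best_x, best_v = x, v
        if not children:
            break
        depth = children
    return best_x, best_v
```

For example, sequool_sketch(lambda x: -abs(x - 0.3) ** 0.5, n=200) should return a point near 0.3 even though the objective is non-differentiable at its maximum. The harmonic schedule is what makes the sketch parameter-free: roughly n/(h log n) openings at depth h sum to about n over all depths without any tuning.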


