Exploiting Higher Order Smoothness in Derivative-free Optimization and Continuous Bandits

06/14/2020
by Arya Akhavan, et al.

We study the problem of zero-order optimization of a strongly convex function. The goal is to find the minimizer of the function by a sequential exploration of its values, under measurement noise. We study the impact of higher order smoothness properties of the function on the optimization error and on the cumulative regret. To solve this problem we consider a randomized approximation of the projected gradient descent algorithm. The gradient is estimated by a randomized procedure involving two function evaluations and a smoothing kernel. We derive upper bounds for this algorithm both in the constrained and unconstrained settings and prove minimax lower bounds for any sequential search method. Our results imply that the zero-order algorithm is nearly optimal in terms of sample complexity and the problem parameters. Based on this algorithm, we also propose an estimator of the minimum value of the function achieving almost sharp oracle behavior. We compare our results with the state of the art, highlighting a number of key improvements.
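For intuition, here is a minimal Python sketch of the kind of method the abstract describes: a two-point, kernel-smoothed randomized gradient estimate plugged into projected gradient descent. It is not the authors' exact procedure; the constraint set (a Euclidean ball), the kernel K(r) = 3r, and the step-size and smoothing-parameter schedules are illustrative assumptions.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto a ball of the given radius (illustrative constraint set)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def two_point_gradient_estimate(f, x, h, rng, kernel=lambda r: 3.0 * r):
    """Randomized gradient estimate from two noisy function evaluations.

    zeta is uniform on the unit sphere and r is uniform on [-1, 1];
    the kernel K(r) = 3r is an illustrative choice, not the kernel from the paper.
    """
    d = x.size
    zeta = rng.standard_normal(d)
    zeta /= np.linalg.norm(zeta)          # uniform random direction on the unit sphere
    r = rng.uniform(-1.0, 1.0)
    diff = f(x + h * r * zeta) - f(x - h * r * zeta)   # two function evaluations
    return (d / (2.0 * h)) * diff * zeta * kernel(r)

def zero_order_pgd(f, x0, n_iter=5000, radius=1.0, seed=0):
    """Projected gradient descent driven by the randomized gradient estimates.

    The step-size and smoothing schedules below are simple placeholders,
    not the tuned schedules analyzed in the paper.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for t in range(1, n_iter + 1):
        h = t ** (-0.25)                  # shrinking smoothing parameter (assumed schedule)
        eta = 1.0 / t                     # step size (assumed schedule)
        g = two_point_gradient_estimate(f, x, h, rng)
        x = project_ball(x - eta * g, radius)
    return x

# Toy usage: noisy evaluations of a strongly convex quadratic.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    target = np.array([0.3, -0.2, 0.5])
    noisy_f = lambda x: np.sum((x - target) ** 2) + 0.01 * rng.standard_normal()
    print(zero_order_pgd(noisy_f, x0=np.zeros(3)))
```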


Related research

02/01/2021  Distributed Zero-Order Optimization under Adversarial Noise
We study the problem of distributed zero-order optimization for a class ...

03/08/2021  On the Oracle Complexity of Higher-Order Smooth Non-Convex Finite-Sum Optimization
We prove lower bounds for higher-order methods in smooth non-convex fini...

10/27/2017  Lower Bounds for Higher-Order Convex Optimization
State-of-the-art methods in convex and non-convex optimization employ hi...

10/01/2018  A simple parameter-free and adaptive approach to optimization under a minimal local smoothness assumption
We study the problem of optimizing a function under a budgeted number of...

06/21/2023  Optimal Algorithms for Stochastic Bilevel Optimization under Relaxed Smoothness Conditions
Stochastic Bilevel optimization usually involves minimizing an upper-lev...

06/03/2023  Gradient-free optimization of highly smooth functions: improved analysis and a new algorithm
This work studies minimization problems with zero-order noisy oracle inf...

07/19/2021  High-Dimensional Simulation Optimization via Brownian Fields and Sparse Grids
High-dimensional simulation optimization is notoriously challenging. We ...
