Improving the Convergence Rate of One-Point Zeroth-Order Optimization using Residual Feedback

06/18/2020
by Yan Zhang, et al.

Many existing zeroth-order optimization (ZO) algorithms adopt two-point feedback schemes because they converge faster than one-point feedback schemes. However, two-point schemes require two evaluations of the objective function at each iteration, which can be impractical in applications where the data are not all available a priori, e.g., in online optimization. In this paper, we propose a novel one-point feedback scheme that queries the function value only once per iteration and estimates the gradient using the residual between the values of two consecutive queries. When optimizing a deterministic Lipschitz function, we show that the query complexity of ZO with the proposed one-point residual feedback matches that of ZO with existing two-point feedback schemes. Moreover, the query complexity of the proposed algorithm can be further improved when the objective function has a Lipschitz gradient. Then, for stochastic bandit optimization problems, we show that ZO with one-point residual feedback achieves the same convergence rate as ZO with two-point feedback when the data samples cannot be controlled. We demonstrate the effectiveness of the proposed one-point residual feedback scheme via extensive numerical experiments.
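As a concrete illustration, below is a minimal sketch of gradient descent driven by a one-point residual-feedback estimator of the kind described above, assuming the estimator takes the common form g_t = (d / delta) * (f(x_t + delta * u_t) - f(x_{t-1} + delta * u_{t-1})) * u_t with u_t drawn uniformly from the unit sphere. The function names, parameter values, and test objective are illustrative choices, not the paper's exact algorithm or tuning.

```python
import numpy as np

def sample_unit_sphere(d, rng):
    """Draw a direction uniformly from the unit sphere in R^d."""
    u = rng.standard_normal(d)
    return u / np.linalg.norm(u)

def zo_residual_feedback(f, x0, step=1e-3, delta=0.1, iters=20000, seed=0):
    """Zeroth-order descent with one-point residual feedback (sketch).

    Each iteration queries the objective f exactly once; the gradient
    is estimated from the residual between the current query and the
    query made at the previous iteration. Parameter choices here are
    illustrative, not the paper's recommended tuning.
    """
    rng = np.random.default_rng(seed)
    d = x0.size
    x = x0.astype(float).copy()
    u = sample_unit_sphere(d, rng)
    f_prev = f(x + delta * u)              # single query at iteration 0
    for _ in range(iters):
        u = sample_unit_sphere(d, rng)
        f_curr = f(x + delta * u)          # the only query this iteration
        g = (d / delta) * (f_curr - f_prev) * u   # residual-feedback estimate
        x -= step * g
        f_prev = f_curr                    # reuse this query as next baseline
    return x

# Usage: minimize a simple quadratic without access to its gradient.
if __name__ == "__main__":
    f = lambda x: np.sum((x - 1.0) ** 2)
    x_star = zo_residual_feedback(f, np.zeros(5))
    print(x_star)  # should be close to the all-ones vector
```

Note that each iteration issues exactly one query to f; the value of the previous query is reused as the baseline, which is what distinguishes residual feedback from classical one-point estimators that scale f(x_t + delta * u_t) alone.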


Related research:

Boosting One-Point Derivative-Free Online Optimization via Residual Feedback (10/14/2020)
Zeroth-order optimization (ZO) typically relies on two-point feedback to...

Convergence Rate Analysis of Accelerated Forward-Backward Algorithm with Generalized Nesterov Momentum Scheme (12/11/2021)
Nesterov's accelerated forward-backward algorithm (AFBA) is an efficient...

Mirror Natural Evolution Strategies (08/01/2023)
The zeroth-order optimization has been widely used in machine learning a...

Proximal Gradient Method for Manifold Optimization (11/02/2018)
This paper considers manifold optimization problems with nonsmooth and n...

Parametric complexity analysis for a class of first-order Adagrad-like algorithms (03/03/2022)
A class of algorithms for optimization in the presence of noise is prese...

Fast and Simple Optimization for Poisson Likelihood Models (08/03/2016)
Poisson likelihood models have been prevalently used in imaging, social ...

Lazy Queries Can Reduce Variance in Zeroth-order Optimization (06/14/2022)
A major challenge of applying zeroth-order (ZO) methods is the high quer...
