Zeroth-Order Stochastic Variance Reduction for Nonconvex Optimization

05/25/2018
by Sijia Liu, et al.

As application demands for zeroth-order (gradient-free) optimization accelerate, the need for variance reduced and faster converging approaches is also intensifying. This paper addresses these challenges by presenting: a) a comprehensive theoretical analysis of variance reduced zeroth-order (ZO) optimization, b) a novel variance reduced ZO algorithm, called ZO-SVRG, and c) an experimental evaluation of our approach in the context of two compelling applications, black-box chemical material classification and generation of adversarial examples from black-box deep neural network models. Our theoretical analysis uncovers an essential difficulty in the analysis of ZO-SVRG: the unbiased assumption on gradient estimates no longer holds. We prove that compared to its first-order counterpart, ZO-SVRG with a two-point random gradient estimator suffers an additional error of order O(1/b), where b is the mini-batch size. To mitigate this error, we propose two accelerated versions of ZO-SVRG utilizing variance reduced gradient estimators, which achieve the best rate known for ZO stochastic optimization (in terms of iterations). Our extensive experimental results show that our approaches outperform other state-of-the-art ZO algorithms, and strike a balance between the convergence rate and the function query complexity.
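The two-point random gradient estimator mentioned above can be sketched as follows. This is a hedged illustration, not the paper's exact implementation: it averages `b` directional finite differences of `f` along random unit directions, which is the standard form of such estimators; the function name, parameter names, and defaults (`mu`, `b`) are illustrative choices, and the O(1/b) error term discussed in the abstract corresponds to the variance left after averaging over the `b` random directions.

```python
import numpy as np

def two_point_grad_estimate(f, x, mu=1e-6, b=10, rng=None):
    """Sketch of a two-point random gradient estimator:
        g = (d / (b * mu)) * sum_i (f(x + mu*u_i) - f(x)) * u_i,
    where each u_i is drawn uniformly from the unit sphere in R^d.
    Only function values of f are used -- no true gradients.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    fx = f(x)                            # one query shared by all b directions
    g = np.zeros(d)
    for _ in range(b):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)           # random direction on the unit sphere
        g += (f(x + mu * u) - fx) / mu * u
    return (d / b) * g
```

For a smooth `f` and small `mu`, the estimate is a biased but increasingly accurate proxy for the true gradient as `b` grows; a variance-reduced method like ZO-SVRG combines such estimates at the current iterate and at a periodically refreshed reference point.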

02/16/2019

Faster Gradient-Free Proximal Stochastic Methods for Nonconvex Nonsmooth Optimization

Proximal gradient method has been playing an important role to solve man...
12/21/2020

Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box Optimization Framework

In this work, we focus on the study of stochastic zeroth-order (ZO) opti...
05/30/2018

Stochastic Zeroth-order Optimization via Variance Reduction method

Derivative-free optimization has become an important technique used in m...
10/27/2019

Improved Zeroth-Order Variance Reduced Algorithms and Analysis for Nonconvex Optimization

Two types of zeroth-order stochastic algorithms have recently been desig...
12/29/2018

Hessian-Aware Zeroth-Order Optimization for Black-Box Adversarial Attack

Zeroth-order optimization or derivative-free optimization is an importan...
05/25/2018

A New Analysis of Variance Reduced Stochastic Proximal Methods for Composite Optimization with Serial and Asynchronous Realizations

We provide a comprehensive analysis of stochastic variance reduced gradi...
07/21/2021

On the Convergence of Prior-Guided Zeroth-Order Optimization Algorithms

Zeroth-order (ZO) optimization is widely used to handle challenging task...
