A Dimension-Insensitive Algorithm for Stochastic Zeroth-Order Optimization

04/22/2021
by Hongcheng Liu, et al.

This paper concerns a convex stochastic zeroth-order optimization (S-ZOO) problem, in which the objective is to minimize the expectation of a cost function whose gradient is not directly accessible. For this problem, traditional optimization techniques mostly yield query complexities that grow polynomially with dimensionality, i.e., the number of function evaluations needed is polynomial in the number of decision variables. Consequently, these methods may perform poorly on the massive-dimensional problems arising in many modern applications. Although more recent methods can be provably dimension-insensitive, almost all of them require arguably more stringent conditions, such as everywhere-sparse or compressible gradients. Prior to this research, it was therefore unknown whether dimension-insensitive S-ZOO is possible without such conditions. This paper gives an affirmative answer by proposing a sparsity-inducing stochastic gradient-free (SI-SGF) algorithm, which provably achieves dimension-insensitive query complexity in both the convex and the strongly convex case even when neither gradient sparsity nor gradient compressibility holds. Numerical results demonstrate the strong potential of SI-SGF compared with existing alternatives.
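The abstract does not spell out SI-SGF's update rule, so the sketch below only illustrates the two generic ingredients its name points to: a gradient estimate built purely from function evaluations (here a standard two-point Gaussian-smoothing estimator) and a sparsity-inducing proximal step (soft-thresholding on an ℓ_1 penalty). Everything in it, including the names si_zo_sgf, lr, lam, and mu and all parameter values, is an illustrative assumption, not the paper's actual algorithm or analysis.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||x||_1 (coordinate-wise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def zo_gradient_estimate(f, x, mu, num_dirs, rng):
    """Two-point Gaussian-smoothing gradient estimate built from
    function evaluations only (zeroth-order queries)."""
    d = x.size
    g = np.zeros(d)
    for _ in range(num_dirs):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
    return g / num_dirs

def si_zo_sgf(f, x0, steps=800, lr=0.01, lam=0.02, mu=0.05,
              num_dirs=4, seed=0):
    """Illustrative sparsity-inducing zeroth-order method: a gradient-free
    descent step followed by a proximal l1 (soft-thresholding) step.
    This is NOT the SI-SGF update from the paper, only a generic sketch."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(steps):
        g = zo_gradient_estimate(f, x, mu, num_dirs, rng)
        x = soft_threshold(x - lr * g, lr * lam)
    return x

if __name__ == "__main__":
    # Toy S-ZOO instance: a noisy quadratic with a sparse minimizer,
    # observed only through function values (no gradient oracle).
    d = 50
    x_star = np.zeros(d)
    x_star[:5] = 1.0
    noise_rng = np.random.default_rng(1)
    f = lambda x: np.sum((x - x_star) ** 2) + 1e-3 * noise_rng.standard_normal()
    x_hat = si_zo_sgf(f, np.zeros(d))
    print("nonzeros:", np.count_nonzero(np.abs(x_hat) > 1e-3))
    print("distance to minimizer:", np.linalg.norm(x_hat - x_star))
```

On this toy instance the soft-thresholding step zeroes out the coordinates outside the true support, which is the intuition behind using a sparsity-inducing step even when the gradient itself is not assumed sparse.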


Related research

06/15/2020 · Improved Complexities for Stochastic Conditional Gradient Methods under Interpolation-like Conditions
We analyze stochastic conditional gradient type methods for constrained ...

09/17/2018 · Zeroth-order (Non)-Convex Stochastic Optimization via Conditional Gradient and Gradient Updates
In this paper, we propose and analyze zeroth-order stochastic approximat...

10/29/2017 · Stochastic Zeroth-order Optimization in High Dimensions
We consider the problem of optimizing a high-dimensional convex function...

09/18/2020 · Hybrid Stochastic-Deterministic Minibatch Proximal Gradient: Less-Than-Single-Pass Optimization with Nearly Optimal Generalization
Stochastic variance-reduced gradient (SVRG) algorithms have been shown t...

04/07/2020 · Orthant Based Proximal Stochastic Gradient Method for ℓ_1-Regularized Optimization
Sparsity-inducing regularization problems are ubiquitous in machine lear...

12/29/2017 · A Stochastic Trust Region Algorithm
An algorithm is proposed for solving stochastic and finite sum minimizat...

06/12/2018 · Sparse Stochastic Zeroth-Order Optimization with an Application to Bandit Structured Prediction
Stochastic zeroth-order (SZO), or gradient-free, optimization allows to ...
