An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization

02/25/2018
by Pavel Dvurechensky et al.

We consider an unconstrained problem of minimizing a smooth convex function which is available only through noisy observations of its values, the noise consisting of two parts. As in stochastic optimization problems, the first part is of a stochastic nature. In contrast, the second part is an additive noise of unknown nature, but bounded in absolute value. In the two-point feedback setting, i.e. when pairs of function values are available, we propose an accelerated derivative-free algorithm together with its complexity analysis. The complexity bound of our derivative-free algorithm is larger only by a factor of √(n) than the bound for accelerated gradient-based algorithms, where n is the dimension of the decision variable. We also propose a non-accelerated derivative-free algorithm with a complexity bound similar to that of stochastic-gradient-based algorithms; that is, our bound does not have any dimension-dependent factor. Interestingly, if the solution of the problem is sparse, then for both our algorithms we obtain a better complexity bound if the algorithm uses a 1-norm proximal setup rather than the Euclidean proximal setup, which is the standard choice for unconstrained problems.
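To illustrate the two-point feedback setting described above, here is a minimal sketch of a zeroth-order method: a gradient estimate is formed from a pair of noisy function values along a random direction, then used in a plain (non-accelerated) gradient step. The objective, noise levels, smoothing parameter, and step size below are all illustrative assumptions, not the paper's actual algorithm or constants.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_noisy(x, sigma=1e-6):
    """Noisy oracle for a hypothetical smooth convex function f(x) = ||x||^2 / 2.

    Returns f(x) plus stochastic noise (Gaussian) plus a small additive term
    of unknown nature that is bounded in absolute value.
    """
    return 0.5 * np.dot(x, x) + sigma * rng.normal() + 1e-6 * np.sin(x.sum())

def two_point_grad_estimate(x, tau=1e-3):
    """Two-point feedback gradient estimator along a random unit direction e.

    E[n * <grad f(x), e> * e] = grad f(x) for e uniform on the unit sphere.
    """
    n = x.size
    e = rng.normal(size=n)
    e /= np.linalg.norm(e)  # uniform direction on the unit sphere
    return n * (f_noisy(x + tau * e) - f_noisy(x - tau * e)) / (2 * tau) * e

def zeroth_order_gd(x0, steps=2000, lr=0.05):
    """Non-accelerated derivative-free descent using the two-point estimator."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * two_point_grad_estimate(x)
    return x

x_final = zeroth_order_gd(np.ones(5))  # converges toward the minimizer at 0
```

The factor n in the estimator compensates for projecting the gradient onto a single random direction; its variance is what produces the dimension-dependent factors in the complexity bounds discussed in the abstract.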


