An Optimal Hybrid Variance-Reduced Algorithm for Stochastic Composite Nonconvex Optimization

08/20/2020
by   Deyi Liu, et al.

In this note we propose a new variant of the hybrid variance-reduced proximal gradient method in [7] to solve a common stochastic composite nonconvex optimization problem under standard assumptions. We simply replace the independent unbiased estimator in the hybrid-SARAH estimator introduced in [7] with the stochastic gradient evaluated at the same sample, which recovers exactly the momentum-SARAH estimator introduced in [2]. This saves one stochastic gradient evaluation per iteration compared to [7] and requires only two samples per iteration. The resulting algorithm is very simple and achieves the optimal stochastic oracle complexity bound in terms of stochastic gradient evaluations (up to a constant factor). Our analysis is essentially inspired by [7], but it does not require two different step-sizes.
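To make the estimator concrete: the momentum-SARAH (hybrid) update described above is a convex combination of a SARAH-style recursive correction and a plain stochastic gradient, both evaluated at the same sample \xi_t,

    v_t = (1-\beta_t)\big(v_{t-1} + \nabla f(x_t;\xi_t) - \nabla f(x_{t-1};\xi_t)\big) + \beta_t \nabla f(x_t;\xi_t).

Below is a minimal Python sketch of one proximal gradient iteration driven by this estimator, with a toy usage example. The oracle `stoch_grad`, the proximal operator `prox`, and all parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def momentum_sarah_step(x, x_prev, v_prev, xi, stoch_grad, prox, beta, eta):
    """One proximal step driven by the momentum-SARAH (hybrid) estimator.

    Hypothetical interfaces: stoch_grad(x, xi) returns the stochastic
    gradient of the smooth part at x for sample xi; prox(z, t) is the
    proximal operator of the nonsmooth part; beta is the momentum weight
    and eta the single step-size used throughout.
    """
    g_cur = stoch_grad(x, xi)        # gradient at the current iterate
    g_prev = stoch_grad(x_prev, xi)  # gradient at the previous iterate, SAME sample xi
    # Convex combination of the SARAH correction and the plain stochastic
    # gradient at the same sample: no extra oracle call is needed for an
    # independent unbiased estimator.
    v = (1.0 - beta) * (v_prev + g_cur - g_prev) + beta * g_cur
    x_next = prox(x - eta * v, eta)  # proximal gradient step
    return x_next, v

# Toy usage: minimize E[0.5*||x - xi||^2] + lam*||x||_1 on synthetic data.
rng = np.random.default_rng(0)
lam, beta, eta = 0.01, 0.1, 0.05
stoch_grad = lambda x, xi: x - xi  # gradient of 0.5*||x - xi||^2
prox = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - lam * t, 0.0)  # l1 prox
x_prev = np.zeros(5)
v = stoch_grad(x_prev, rng.normal(size=5))  # initialize with one stochastic gradient
x = prox(x_prev - eta * v, eta)
for _ in range(100):
    xi = rng.normal(size=5)  # fresh sample each iteration
    x_next, v = momentum_sarah_step(x, x_prev, v, xi, stoch_grad, prox, beta, eta)
    x_prev, x = x, x_next
```

Note that beta = 1 recovers plain proximal SGD and beta = 0 the pure SARAH recursion; the hybrid weight interpolates between the two.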


Related research

05/15/2019 · Hybrid Stochastic Gradient Descent Algorithms for Stochastic Nonconvex Optimization
We introduce a hybrid stochastic estimator to design stochastic gradient...

07/08/2019 · A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization
In this paper, we introduce a new approach to develop stochastic optimiz...

06/20/2018 · Stochastic Nested Variance Reduction for Nonconvex Optimization
We study finite-sum nonconvex optimization problems, where the objective...

02/07/2019 · Momentum Schemes with Stochastic Variance Reduction for Nonconvex Composite Optimization
Two new stochastic variance-reduced algorithms named SARAH and SPIDER ha...

10/04/2016 · A SMART Stochastic Algorithm for Nonconvex Optimization with Applications to Robust Machine Learning
In this paper, we show how to transform any optimization problem that ar...

05/31/2020 · Momentum-based variance-reduced proximal stochastic gradient method for composite nonconvex stochastic optimization
Stochastic gradient methods (SGMs) have been extensively used for solvin...

10/25/2018 · SpiderBoost: A Class of Faster Variance-reduced Algorithms for Nonconvex Optimization
There has been extensive research on developing stochastic variance redu...
