Guaranteed Sufficient Decrease for Stochastic Variance Reduced Gradient Optimization

02/26/2018
by Fanhua Shang, et al.

In this paper, we propose a novel sufficient decrease technique for stochastic variance reduced gradient descent methods such as SVRG and SAGA. To guarantee sufficient decrease in stochastic optimization, we design a new sufficient decrease criterion, which yields sufficient decrease versions of stochastic variance reduction algorithms, such as SVRG-SD and SAGA-SD, as a byproduct. We introduce a coefficient that scales the current iterate so that the sufficient decrease property is satisfied; this coefficient decides whether to shrink, expand, or even move in the opposite direction, and we give two specific update rules for it for Lasso and ridge regression. Moreover, we analyze the convergence properties of our algorithms for strongly convex problems and show that they attain linear convergence rates. We also provide convergence guarantees for our algorithms on non-strongly convex problems. Our experimental results further verify that our algorithms achieve significantly better performance than their counterparts.
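To make the iterate-scaling idea concrete, below is a minimal sketch of an SVRG-style loop in which a coefficient theta rescales the current iterate before the variance-reduced step. The ridge-regression objective, the candidate-based choice of theta, and all function names here are illustrative assumptions for exposition only, not the paper's exact SVRG-SD update rules.

```python
# Illustrative sketch only: an SVRG-style loop where a coefficient `theta`
# rescales the current iterate (shrink, expand, or reverse) before the
# variance-reduced gradient step. The objective (ridge regression) and the
# way `theta` is chosen are assumptions, not the paper's SVRG-SD rules.
import numpy as np

def ridge_loss(w, A, b, lam):
    r = A @ w - b
    return 0.5 * np.mean(r ** 2) + 0.5 * lam * np.dot(w, w)

def svrg_sd_sketch(A, b, lam=1e-3, eta=0.1, epochs=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # Full gradient at the snapshot point (standard SVRG ingredient).
        full_grad = A.T @ (A @ w - b) / n + lam * w
        w_snap = w.copy()
        for _ in range(n):
            i = rng.integers(n)
            g_i = A[i] * (A[i] @ w - b[i]) + lam * w
            g_snap_i = A[i] * (A[i] @ w_snap - b[i]) + lam * w_snap
            v = g_i - g_snap_i + full_grad  # variance-reduced gradient
            # Hypothetical sufficient-decrease step: try a few scalings of the
            # current iterate and keep the candidate with the lowest objective.
            best_w, best_f = None, np.inf
            for theta in (1.0, 0.5, 1.5, -0.5):
                cand = theta * w - eta * v
                f = ridge_loss(cand, A, b, lam)
                if f < best_f:
                    best_w, best_f = cand, f
            w = best_w
    return w

# Tiny usage example on synthetic data.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 10))
    b = A @ rng.standard_normal(10) + 0.1 * rng.standard_normal(200)
    w_hat = svrg_sd_sketch(A, b)
    print("final ridge loss:", ridge_loss(w_hat, A, b, 1e-3))
```

Evaluating the full objective to select theta is done here only for clarity; closed-form update rules for the coefficient, such as those the paper derives for Lasso and ridge regression, would avoid this per-iteration cost.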


Related research:

03/20/2017 · Guaranteed Sufficient Decrease for Variance Reduced Stochastic Gradient Descent
In this paper, we propose a novel sufficient decrease technique for vari...

04/17/2017 · Larger is Better: The Effect of Learning Rates Enjoyed by Stochastic Optimization with Progressive Variance Reduction
In this paper, we propose a simple variant of the original stochastic va...

02/26/2018 · VR-SGD: A Simple Stochastic Variance Reduction Method for Machine Learning
In this paper, we propose a simple variant of the original SVRG, called ...

06/28/2018 · A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates
Recent years have witnessed exciting progress in the study of stochastic...

10/27/2017 · Stochastic Conjugate Gradient Algorithm with Variance Reduction
Conjugate gradient methods are a class of important methods for solving ...

12/14/2020 · Noisy Linear Convergence of Stochastic Gradient Descent for CV@R Statistical Learning under Polyak-Łojasiewicz Conditions
Conditional Value-at-Risk (CV@R) is one of the most popular measures of ...

06/04/2017 · Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory
We develop a family of reformulations of an arbitrary consistent linear ...
