A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates

06/28/2018
by Kaiwen Zhou, et al.

Recent years have witnessed exciting progress in the study of stochastic variance reduced gradient methods (e.g., SVRG, SAGA), their accelerated variants (e.g., Katyusha), and their extensions to many different settings (e.g., online, sparse, asynchronous, distributed). Among them, accelerated methods enjoy improved convergence rates but have complex coupling structures, which makes them hard to extend to further settings (e.g., sparse and asynchronous) because the coupled iterates are sensitive to the perturbations those settings introduce. In this paper, we introduce a simple stochastic variance reduced algorithm (MiG), which enjoys the best-known convergence rates for both strongly convex and non-strongly convex problems. Moreover, we present efficient sparse and asynchronous variants of MiG and theoretically analyze their convergence rates in these settings. Finally, extensive experiments on various machine learning problems, such as logistic regression, illustrate the practical improvement in both serial and asynchronous settings.
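The abstract does not spell out MiG's update rule, but the variance reduction idea this family of methods (SVRG, SAGA, Katyusha, MiG) builds on can be made concrete. Below is a minimal Python sketch of the classic SVRG-style variance-reduced gradient estimator; the function names, callback signature, step size, and epoch structure are illustrative assumptions for exposition, not the algorithm from the paper.

```python
import numpy as np

def svrg(grad_i, x0, n, step_size=0.02, epochs=20, inner_iters=None, rng=None):
    """Minimal SVRG-style sketch for minimizing (1/n) * sum_i f_i(x).

    grad_i(x, i) returns the gradient of the i-th component f_i at x.
    All names and defaults here are illustrative assumptions, not the
    tuned settings from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    m = n if inner_iters is None else inner_iters
    snapshot = np.asarray(x0, dtype=float)
    for _ in range(epochs):
        # Recompute the exact full gradient once per epoch at the snapshot.
        full_grad = np.mean([grad_i(snapshot, i) for i in range(n)], axis=0)
        x = snapshot.copy()
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced estimator: unbiased for the full gradient,
            # with variance that shrinks as x and snapshot near the optimum.
            g = grad_i(x, i) - grad_i(snapshot, i) + full_grad
            x = x - step_size * g
        snapshot = x  # refresh the snapshot for the next epoch
    return snapshot

# Toy usage (hypothetical data): least squares, f_i(x) = 0.5*(a_i @ x - b_i)**2
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 5))
b = A @ rng.normal(size=5)
x_hat = svrg(lambda x, i: (A[i] @ x - b[i]) * A[i], np.zeros(5), n=200)
```

The key property is that the estimator g is unbiased and its variance vanishes as both x and the snapshot approach the optimum, which is what permits constant step sizes and the fast rates discussed above; accelerated variants such as Katyusha and MiG couple this estimator with a momentum term, and it is exactly that coupling structure the abstract identifies as fragile in sparse and asynchronous settings.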
