Dissipativity Theory for Accelerating Stochastic Variance Reduction: A Unified Analysis of SVRG and Katyusha Using Semidefinite Programs

06/10/2018
by Bin Hu, et al.

Techniques for reducing the variance of gradient estimates used in stochastic programming algorithms for convex finite-sum problems have received a great deal of attention in recent years. By leveraging dissipativity theory from control, we provide a new perspective on two important variance-reduction algorithms: SVRG and its direct accelerated variant Katyusha. Our perspective provides a physically intuitive understanding of the behavior of SVRG-like methods via a principle of energy conservation. The tools discussed here allow us to automate the convergence analysis of SVRG-like methods by capturing their essential properties in small semidefinite programs amenable to standard analysis and computational techniques. Our approach recovers existing convergence results for SVRG and Katyusha and generalizes the theory to alternative parameter choices. We also discuss how our approach complements the linear coupling technique. Our combination of perspectives leads to a better understanding of accelerated variance-reduced stochastic methods for finite-sum problems.
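To make the SDP viewpoint concrete, below is a minimal sketch of how a dissipation inequality becomes a small semidefinite program. This is our own illustration under stated assumptions, not the paper's exact SVRG or Katyusha formulation: for brevity it certifies a linear rate rho for deterministic gradient descent on an L-smooth, m-strongly convex function, using a quadratic storage function V and a sector-bound supply rate. The paper's analyses instantiate the same template with additional supply-rate terms that account for the variance of the stochastic gradient estimate. The function and parameter names here are hypothetical.

```python
# A minimal sketch (our illustration, not the paper's SVRG program) of the
# dissipativity-to-SDP pipeline: find a storage function V and a multiplier
# lam certifying V(x_{k+1}) <= rho^2 V(x_k) for gradient descent
# x_{k+1} = x_k - alpha * grad f(x_k) on an L-smooth, m-strongly convex f.
import cvxpy as cp
import numpy as np

def rate_is_certified(rho, m, L, alpha):
    """Feasibility of V(x_{k+1}) - rho^2 V(x_k) <= lam * S(x_k, w_k),
    with storage V(x) = p * (x - x*)^2 and supply rate S <= 0 encoding
    the sector bound (w - m*xi)'(w - L*xi) <= 0, xi = x - x*, w = grad f(x)."""
    p = cp.Variable(nonneg=True)    # storage-function weight
    lam = cp.Variable(nonneg=True)  # multiplier on the supply rate
    A, B = 1.0, -alpha              # state-space form of gradient descent
    # Quadratic form of V(x_{k+1}) - rho^2 V(x_k) in (xi_k, w_k), per unit p:
    T = np.array([[A * A - rho**2, A * B],
                  [A * B, B * B]])
    # Quadratic form of the sector (supply-rate) constraint, which is <= 0:
    M = np.array([[m * L, -(m + L) / 2.0],
                  [-(m + L) / 2.0, 1.0]])
    # The dissipation inequality holds for all (xi, w) iff this LMI holds;
    # p >= 1 rules out the trivial p = 0 solution.
    prob = cp.Problem(cp.Minimize(0), [p * T - lam * M << 0, p >= 1.0])
    prob.solve(solver=cp.SCS)
    return prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)

m, L = 1.0, 10.0
alpha = 1.0 / L
lo, hi = 0.0, 1.0
for _ in range(30):  # bisect down to the smallest certifiable rate
    mid = 0.5 * (lo + hi)
    if rate_is_certified(mid, m, L, alpha):
        hi = mid
    else:
        lo = mid
print(f"SDP-certified rate: {hi:.4f}  (known tight rate 1 - m/L = {1 - m/L:.4f})")
```

With alpha = 1/L the bisection recovers the known tight rate 1 - m/L, and because the SDP has only two scalar variables it solves in milliseconds; the SVRG and Katyusha programs in the paper stay similarly small, which is what makes the convergence analysis automatable.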


Related research

- Direct Acceleration of SAGA using Sampled Negative Momentum (06/28/2018): Variance reduction is a simple and effective technique that accelerates ...
- A unified variance-reduced accelerated gradient method for convex optimization (05/29/2019): We propose a novel randomized incremental gradient algorithm, namely, VA...
- Weighted ensemble: Recent mathematical developments (06/29/2022): The weighted ensemble (WE) method, an enhanced sampling approach based o...
- Variance Reduction on Adaptive Stochastic Mirror Descent (12/26/2020): We study the idea of variance reduction applied to adaptive stochastic m...
- Adaptive Accelerated (Extra-)Gradient Methods with Variance Reduction (01/28/2022): In this paper, we study the finite-sum convex optimization problem focus...
- Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems (02/11/2020): We propose an accelerated version of stochastic variance reduced coordin...
- Variance Reduction for Sequential Sampling in Stochastic Programming (05/05/2020): This paper investigates the variance reduction techniques Antithetic Var...
