Accelerated Variance Reduced Stochastic ADMM

07/11/2017
by Yuanyuan Liu, et al.

Recently, many variance-reduced stochastic alternating direction method of multipliers (ADMM) methods (e.g., SAG-ADMM, SDCA-ADMM, and SVRG-ADMM) have made exciting progress, such as achieving linear convergence rates for strongly convex problems. However, the best known convergence rate for general convex problems is O(1/T), as opposed to the O(1/T^2) rate of accelerated batch algorithms, where T is the number of iterations. Thus, a gap remains between the convergence rates of existing stochastic ADMM methods and those of batch algorithms. To bridge this gap, we introduce the momentum acceleration trick from batch optimization into the stochastic variance reduced gradient based ADMM (SVRG-ADMM), which yields an accelerated SVRG-ADMM (ASVRG-ADMM) method. We then design two different momentum-term update rules for the strongly convex and general convex cases. We prove that ASVRG-ADMM converges linearly for strongly convex problems. Besides having a low per-iteration complexity, like existing stochastic ADMM methods, ASVRG-ADMM improves the convergence rate for general convex problems from O(1/T) to O(1/T^2). Our experimental results demonstrate the effectiveness of ASVRG-ADMM.
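The core ingredients the abstract describes, an SVRG-style variance-reduced stochastic gradient combined with a momentum (extrapolation) step, can be sketched in a simplified setting that omits the ADMM constraint handling. The least-squares objective, step size `eta`, momentum weight `theta`, and iteration counts below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Sketch: variance-reduced stochastic gradient descent with a momentum term,
# on least squares f(x) = (1/2n) * ||A x - b||^2.  This illustrates the
# SVRG + momentum idea only; the paper's method also handles an ADMM
# equality constraint, which is omitted here.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true

def grad_i(x, i):
    # Gradient of the i-th component f_i(x) = (1/2) * (a_i^T x - b_i)^2.
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    return A.T @ (A @ x - b) / n

eta, theta = 0.01, 0.5      # step size and momentum weight (assumed values)
x = np.zeros(d)
y = x.copy()                # y is the momentum/extrapolation sequence
for epoch in range(100):
    x_snap = x.copy()       # snapshot point for variance reduction
    mu = full_grad(x_snap)  # full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced stochastic gradient at the extrapolated point y.
        g = grad_i(y, i) - grad_i(x_snap, i) + mu
        x_new = y - eta * g
        # Momentum update: extrapolate along the most recent step.
        y = x_new + theta * (x_new - x)
        x = x_new

# Distance to the solution; shrinks toward 0 as the iterates converge.
print(float(np.linalg.norm(x - x_true)))
```

Because the correction term `grad_i(x_snap, i) - mu` cancels the sampling noise as the iterates approach the snapshot, the gradient variance vanishes near the optimum, which is what lets momentum be applied without the usual divergence risk of plain SGD.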


Related research

- 01/06/2020: Convergence rates for an inexact ADMM applied to separable convex optimization
- 04/24/2016: Stochastic Variance-Reduced ADMM
- 11/14/2017: Stochastic Strictly Contractive Peaceman-Rachford Splitting Method
- 05/11/2017: Fast Stochastic Variance Reduced ADMM for Stochastic Composition Optimization
- 06/28/2018: A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates
- 03/18/2020: Accelerated ADMM based Trajectory Optimization for Legged Locomotion with Coupled Rigid Body Dynamics
- 08/13/2018: Relax, and Accelerate: A Continuous Perspective on ADMM
