Relax, and Accelerate: A Continuous Perspective on ADMM

08/13/2018
by Guilherme França, et al.

The acceleration technique first introduced by Nesterov for gradient descent is widely used in machine learning applications, but it is not yet well understood. Recently, significant progress has been made toward closing this understanding gap by studying the continuous-time dynamical systems associated with gradient-based methods. In this paper, we extend this perspective by deriving the continuous limit of the family of relaxed Alternating Direction Method of Multipliers (ADMM) algorithms. We also introduce two new families of relaxed and accelerated ADMM algorithms: one follows Nesterov's acceleration approach, and the other is inspired by Polyak's heavy-ball method. We derive the continuous limits of both families as differential equations and, using a Lyapunov stability analysis of the resulting dynamical systems, obtain rate-of-convergence results for convex and strongly convex objective functions.
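To make the relaxation step concrete, the following is a minimal sketch (not taken from the paper) of the standard relaxed ADMM iteration, applied to a toy separable lasso-type problem where every subproblem has a closed form. The penalty `rho`, relaxation parameter `alpha`, and the choice of test problem are illustrative assumptions, not the paper's setup.

```python
def soft_threshold(v, t):
    """Proximal operator of t*|.|: sign(v) * max(|v| - t, 0)."""
    return max(v - t, 0.0) - max(-v - t, 0.0)

def relaxed_admm(b, lam, rho=1.0, alpha=1.5, iters=500):
    """Relaxed ADMM (scaled dual form) for the separable toy problem
        min_x  0.5 * sum((x_i - b_i)^2) + lam * sum(|x_i|),
    split as f(x) + g(z) subject to x = z.
    alpha in (0, 2) is the relaxation parameter: alpha = 1 recovers
    plain ADMM, alpha > 1 is over-relaxation, alpha < 1 under-relaxation.
    """
    n = len(b)
    z = [0.0] * n
    u = [0.0] * n
    for _ in range(iters):
        # x-update: argmin_x 0.5*(x - b)^2 + (rho/2)*(x - z + u)^2
        x = [(b[i] + rho * (z[i] - u[i])) / (1.0 + rho) for i in range(n)]
        # relaxation: blend the fresh x with the previous z
        x_hat = [alpha * x[i] + (1.0 - alpha) * z[i] for i in range(n)]
        # z-update: proximal step on (lam/rho)*|.| (soft-thresholding)
        z = [soft_threshold(x_hat[i] + u[i], lam / rho) for i in range(n)]
        # dual update on the scaled multiplier u
        u = [u[i] + x_hat[i] - z[i] for i in range(n)]
    return z
```

Because the toy objective is separable, its exact minimizer is the soft-thresholded vector, e.g. for `b = [3.0, -0.05, 1.0]` and `lam = 0.5` the solution is `[2.5, 0.0, 0.5]`, which the iteration converges to.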

Related research

- Adaptive Relaxed ADMM: Convergence Theory and Practical Implementation (04/10/2017): Many modern computer vision and machine learning applications rely on so...
- Accelerated Variance Reduced Stochastic ADMM (07/11/2017): Recently, many variance reduced stochastic alternating direction method ...
- Gradient Flows and Accelerated Proximal Splitting Methods (08/02/2019): Proximal based methods are well-suited to nonsmooth optimization problem...
- A Variational Perspective on Accelerated Methods in Optimization (03/14/2016): Accelerated gradient methods play a central role in optimization, achiev...
- Stochastic Modified Equations for Continuous Limit of Stochastic ADMM (03/07/2020): Stochastic version of alternating direction method of multiplier (ADMM) ...
- Tuning Over-Relaxed ADMM (03/10/2017): The framework of Integral Quadratic Constraints (IQC) reduces the comput...
- On Symplectic Optimization (02/10/2018): Accelerated gradient methods have had significant impact in machine lear...
