How is Distributed ADMM Affected by Network Topology?

10/02/2017
by   Guilherme França, et al.

When solving consensus optimization problems over a graph, there is often an explicit characterization of the convergence rate of Gradient Descent (GD) using the spectrum of the graph Laplacian. The same type of problems under the Alternating Direction Method of Multipliers (ADMM) are, however, poorly understood. For instance, simple but important non-strongly-convex consensus problems have not yet been analyzed, especially concerning the dependency of the convergence rate on the graph topology. Recently, for a non-strongly-convex consensus problem, a connection between distributed ADMM and lifted Markov chains was proposed, followed by a conjecture that ADMM is faster than GD by a square root factor in its convergence time, in close analogy to the mixing speedup achieved by lifting several Markov chains. Nevertheless, a proof of such a claim is still lacking. Here we provide a full characterization of the convergence of distributed over-relaxed ADMM for the same type of consensus problem in terms of the topology of the underlying graph. Our results provide explicit formulas for optimal parameter selection in terms of the second largest eigenvalue of the transition matrix of the graph's random walk. Another consequence of our results is a proof of the aforementioned conjecture, which, interestingly, we show holds for any graph, even those whose random walks cannot be accelerated via Markov chain lifting.
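To give a feel for the square-root speedup the abstract describes, the sketch below computes the second largest eigenvalue λ₂ of the random-walk transition matrix on a cycle graph and compares the resulting convergence-time scalings. The scalings O(1/(1 − λ₂)) for GD and O(1/√(1 − λ₂)) for ADMM are simplified illustrations of the conjectured speedup, not the paper's exact optimal-parameter formulas.

```python
import numpy as np

def cycle_transition_matrix(n):
    # Transition matrix of the simple random walk on a cycle of n nodes:
    # from each node, step left or right with probability 1/2.
    W = np.zeros((n, n))
    for i in range(n):
        W[i, (i - 1) % n] = 0.5
        W[i, (i + 1) % n] = 0.5
    return W

n = 50
W = cycle_transition_matrix(n)
# W is symmetric, so eigvalsh applies; sort eigenvalues in decreasing order.
eigs = np.sort(np.linalg.eigvalsh(W))[::-1]
lam2 = eigs[1]  # second largest eigenvalue of the transition matrix

# Illustrative convergence-time scalings (simplified, not the paper's formulas):
# GD scales like O(1/(1 - lam2)); the square-root speedup for ADMM
# gives O(1/sqrt(1 - lam2)).
t_gd = 1.0 / (1.0 - lam2)
t_admm = 1.0 / np.sqrt(1.0 - lam2)
print(f"lambda_2 = {lam2:.4f}")
print(f"GD scale   ~ {t_gd:.1f}")
print(f"ADMM scale ~ {t_admm:.1f}")
```

For the 50-node cycle, λ₂ = cos(2π/50) is close to 1, so the spectral gap is small and the square-root factor is substantial; this is exactly the regime where the speedup matters most.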


