Accelerated Primal-Dual Algorithms for Distributed Smooth Convex Optimization over Networks

10/23/2019
by Jinming Xu, et al.

This paper proposes a novel family of primal-dual distributed algorithms for smooth, convex, multi-agent optimization over networks that use only gradient information and gossip communications. The algorithms can also employ acceleration in both computation and communication. We provide a unified analysis of their convergence rate, measured in terms of the Bregman distance associated with the saddle-point reformulation of the distributed optimization problem. When acceleration is employed, the rate is shown to be optimal, in the sense that it matches (under the proposed metric) existing complexity lower bounds for distributed algorithms applicable to this class of problems using only gradient information and gossip communications. Preliminary numerical results on distributed least-squares regression problems show that the proposed algorithms compare favorably with existing distributed schemes.


Related research

02/25/2018
Gradient Primal-Dual Algorithm Converges to Second-Order Stationary Solutions for Nonconvex Distributed Optimization
In this work, we study two first-order primal-dual based algorithms, the...

04/18/2017
Accelerated Distributed Dual Averaging over Evolving Networks of Growing Connectivity
We consider the problem of accelerating distributed optimization in mult...

09/05/2022
DISA: A Dual Inexact Splitting Algorithm for Distributed Convex Composite Optimization
This paper proposes a novel dual inexact splitting algorithm (DISA) for ...

10/29/2018
Distributed Convex Optimization With Limited Communications
In this paper, a distributed convex optimization algorithm, termed distr...

10/02/2020
Distributed Proximal Splitting Algorithms with Rates and Acceleration
We analyze several generic proximal splitting algorithms well suited for...

07/24/2021
Distributed stochastic inertial methods with delayed derivatives
Stochastic gradient methods (SGMs) are predominant approaches for solvin...

06/11/2020
IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method
We introduce a framework for designing primal methods under the decentra...
