A Canonical Form for First-Order Distributed Optimization Algorithms

09/24/2018
by Akhil Sundararajan et al.

We consider the distributed optimization problem in which a network of agents aims to minimize the average of local functions. To solve this problem, several algorithms have recently been proposed wherein agents perform various combinations of communication with neighbors, local gradient computations, and updates to local state variables. In this paper, we present a canonical form that characterizes any first-order distributed algorithm that can be implemented using a single round of communication and gradient computation per iteration, and where each agent stores up to two state variables. The canonical form features a minimal set of parameters that are both unique and expressive enough to capture any distributed algorithm in this class. The generic nature of our canonical form enables the systematic analysis and design of distributed optimization algorithms.
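For intuition, the following minimal Python sketch illustrates the algorithm class the abstract describes: each agent keeps two local state variables and performs one round of communication and one gradient evaluation per iteration. The two-state template and the parameters alpha and eta are illustrative assumptions in the style of EXTRA/exact-diffusion-type methods, not the paper's canonical parameterization.

    import numpy as np

    # Illustrative sketch only: a two-state first-order method in the class
    # described above. Each iteration uses one communication round (W @ x)
    # and one local gradient evaluation. The template and the parameters
    # alpha, eta are assumptions, not the paper's canonical parameters.

    def distributed_step(x, w, W, grad, alpha, eta):
        """One iteration for all n agents at once.

        x, w : (n,) arrays holding each agent's two local states
        W    : (n, n) doubly stochastic gossip matrix of the network
        grad : maps the stacked states to the stacked local gradients
        """
        y = W @ x                    # single round of communication
        g = grad(x)                  # single local gradient computation
        x_next = y - alpha * g - w   # primary state update
        w_next = w + eta * (x - y)   # correction state accumulates disagreement
        return x_next, w_next

    # Example: agent i minimizes f_i(z) = 0.5 * (z - a_i)^2, so the
    # average of the local functions is minimized at z = mean(a).
    rng = np.random.default_rng(0)
    n = 5
    a = rng.normal(size=n)
    grad = lambda x: x - a           # stacked local gradients

    # Doubly stochastic gossip matrix for a 5-agent ring network.
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 0.5
        W[i, (i - 1) % n] = 0.25
        W[i, (i + 1) % n] = 0.25

    x = np.zeros(n)
    w = np.zeros(n)
    for _ in range(300):
        x, w = distributed_step(x, w, W, grad, alpha=0.3, eta=0.3)

    print(np.max(np.abs(x - a.mean())))  # near zero: agents agree on mean(a)

With eta = 0 and w initialized to zero, the same template reduces to decentralized gradient descent; the second state w is what allows the iterates to converge exactly to the minimizer of the average rather than to a neighborhood of it.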
