
Scalable Average Consensus with Compressed Communications

09/14/2021
by Mohammad Taha Toghani, et al.

We propose a new decentralized average consensus algorithm with compressed communication that scales linearly with the network size n. We prove that the proposed method converges to the average of the initial values held locally by the agents of a network when agents are allowed to communicate with compressed messages. The proposed algorithm works for a broad class of compression operators (possibly biased), where agents interact over arbitrary static, undirected, and connected networks. We further present numerical experiments that confirm our theoretical results and illustrate the scalability and communication efficiency of our algorithm.
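The abstract does not include pseudocode, so as a rough illustration of how gossip with a biased compressor can still reach the exact average, here is a minimal NumPy sketch of difference-compressed gossip in the style of CHOCO-GOSSIP (Koloskova et al.), a related method and not necessarily the authors' algorithm. The key idea is that agents compress only the *change* in their state relative to a publicly known copy, so compression errors do not accumulate. All parameters below (a ring of 10 agents, top-1 sparsification, the step size) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 10, 5                        # number of agents, dimension of each local value
x = rng.normal(size=(n, d))         # initial values held locally by the agents
target = x.mean(axis=0)             # the average the network should reach

# Static, undirected ring topology with symmetric, doubly stochastic weights.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def top_k(v, k=1):
    """Biased compression operator: keep only the k largest-magnitude entries."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

x_hat = np.zeros_like(x)            # publicly shared (compressed) copies of x
gamma = 0.05                        # consensus step size (hypothetical choice)
err0 = np.max(np.abs(x - target))   # initial disagreement with the true average

for _ in range(4000):
    # Each agent broadcasts a compressed correction to its public copy ...
    q = np.stack([top_k(x[i] - x_hat[i]) for i in range(n)])
    x_hat += q
    # ... then takes a gossip step on the public copies of its neighbors.
    x = x + gamma * (W @ x_hat - x_hat)

err = np.max(np.abs(x - target))    # final disagreement with the true average
```

One design point worth noting: because W is doubly stochastic, the gossip step satisfies 1ᵀ(W − I) = 0, so the network-wide average of the local states is preserved exactly at every iteration; compression only affects how fast the agents agree, not what they agree on.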


Related research:

04/18/2022 · On Arbitrary Compression for Decentralized Consensus and Stochastic Optimization over Directed Networks
We study the decentralized consensus and stochastic optimization problem...

09/01/2020 · Distributed Locally Non-interfering Connectivity via Linear Temporal Logic
In this paper, we consider networks of static sensors with integrated se...

02/01/2019 · Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication
We consider decentralized stochastic optimization with the objective fun...

11/03/2020 · A Linearly Convergent Algorithm for Decentralized Optimization: Sending Less Bits for Free!
Decentralized optimization methods enable on-device training of machine ...

03/08/2018 · Distributed Computation of Wasserstein Barycenters over Networks
We propose a new class-optimal algorithm for the distributed computation...

04/29/2018 · Randomization and quantization for average consensus
A variety of problems in distributed control involve a networked system ...

10/11/2021 · Dynamic Median Consensus Over Random Networks
This paper studies the problem of finding the median of N distinct numbe...