Scalable Average Consensus with Compressed Communications

by Mohammad Taha Toghani, et al.

We propose a new decentralized average consensus algorithm with compressed communication that scales linearly with the network size n. We prove that the proposed method converges to the average of the initial values held locally by the agents, even when agents communicate only with compressed messages. The algorithm works with a broad class of compression operators (possibly biased), with agents interacting over arbitrary static, undirected, and connected networks. We also present numerical experiments that confirm our theoretical results and illustrate the scalability and communication efficiency of the algorithm.

