On Arbitrary Compression for Decentralized Consensus and Stochastic Optimization over Directed Networks

04/18/2022
by Mohammad Taha Toghani, et al.

We study the decentralized consensus and stochastic optimization problems with compressed communication over static directed graphs. We propose an iterative gradient-based algorithm that compresses messages according to a desired compression ratio. The proposed method provably reduces the communication overhead on the network at every communication round. In contrast to the existing literature, we allow for arbitrary compression ratios in the communicated messages. We show a linear convergence rate for the proposed method on the consensus problem. Moreover, we provide explicit convergence rates for decentralized stochastic optimization with smooth objective functions that are either (i) strongly convex, (ii) convex, or (iii) non-convex. Finally, we provide numerical experiments illustrating convergence under arbitrary compression ratios and the communication efficiency of our algorithm.
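The abstract does not spell out the update rule, so the following is only a minimal sketch of the two ingredients it describes: a compressor that can meet an arbitrary target ratio, and a gossip loop in which nodes exchange nothing but compressed messages. The sketch follows the CHOCO-Gossip pattern of compressing differences against shared estimates (a related method, not this paper's algorithm), run on a small doubly stochastic mixing matrix for readability, whereas the paper itself targets column-stochastic weights over directed graphs. The names top_k and compressed_consensus, the matrix W, and the step size gamma are all illustrative assumptions.

```python
import numpy as np

def top_k(v, delta):
    """delta-contracted compressor: keep the ceil(delta * d) largest-magnitude
    coordinates of v and zero the rest, for any ratio delta in (0, 1]."""
    k = max(1, int(np.ceil(delta * v.size)))
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

# Illustrative doubly stochastic mixing matrix for a 3-node network; the
# paper's setting instead uses column-stochastic weights on directed graphs.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

def compressed_consensus(x0, delta, gamma=0.1, iters=2000):
    """CHOCO-Gossip-style loop: each node broadcasts only the compressed
    difference q_i = C(x_i - x_hat_i), so every message respects the ratio
    delta while the local values x_i drift toward the network average."""
    x = x0.copy()              # local values, shape (n, dim)
    x_hat = np.zeros_like(x)   # shared estimates all nodes agree on
    for _ in range(iters):
        q = np.stack([top_k(xi - hi, delta) for xi, hi in zip(x, x_hat)])
        x_hat += q                          # everyone applies the broadcast q_i
        x += gamma * (W @ x_hat - x_hat)    # mix using shared estimates only
    return x

x0 = np.array([[ 1.0, -2.0, 0.0,  5.0],
               [ 3.0,  0.5, 1.0, -1.0],
               [-1.0,  4.0, 2.0,  0.0]])
x = compressed_consensus(x0, delta=0.25)    # each message keeps 1 of 4 entries
print(np.abs(x - x0.mean(axis=0)).max())    # deviation from consensus shrinks
```

With delta = 1 every message is exact and the loop reduces to plain gossip once x_hat catches up to x; shrinking delta lowers the per-round payload at the cost of more rounds, which is the bandwidth/iteration trade-off the abstract's experiments are meant to quantify.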


Related research

Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication (02/01/2019)
We consider decentralized stochastic optimization with the objective fun...

Adaptive Consensus: A network pruning approach for decentralized optimization (09/06/2023)
We consider network-based decentralized optimization problems, where eac...

Scalable Average Consensus with Compressed Communications (09/14/2021)
We propose a new decentralized average consensus algorithm with compress...

Personalized Decentralized Bilevel Optimization over Stochastic and Directed Networks (10/05/2022)
While personalization in distributed learning has been extensively studi...

Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks (12/10/2018)
Network consensus optimization has received increasing attention in rece...

Innovation Compression for Communication-efficient Distributed Optimization with Linear Convergence (05/14/2021)
Information compression is essential to reduce communication cost in dis...

A Linearly Convergent Algorithm for Decentralized Optimization: Sending Less Bits for Free! (11/03/2020)
Decentralized optimization methods enable on-device training of machine ...
