Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks

12/10/2018
by   Xin Zhang, et al.

Network consensus optimization has received increasing attention in recent years and has found important applications in many scientific and engineering fields. One of the best-known approaches to solving network consensus optimization problems is the distributed gradient descent method (DGD). However, in networks with slow communication rates, DGD performs poorly on high-dimensional consensus problems due to the communication bottleneck. This motivates us to design a communication-efficient DGD-type algorithm based on compressed information exchanges. Our contributions in this paper are three-fold: i) We develop a communication-efficient algorithm called amplified-differential compression DGD (ADC-DGD) and show that it converges under any unbiased compression operator; ii) We rigorously prove the convergence performance of ADC-DGD and show that it matches that of DGD without compression; iii) We reveal an interesting phase transition phenomenon in the convergence speed of ADC-DGD. Collectively, our findings advance the state-of-the-art of network consensus optimization theory.


Related research

- Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication (02/01/2019)
  We consider decentralized stochastic optimization with the objective fun...
- Communication-Efficient Network-Distributed Optimization with Differential-Coded Compressors (12/06/2019)
  Network-distributed optimization has attracted significant attention in ...
- On Arbitrary Compression for Decentralized Consensus and Stochastic Optimization over Directed Networks (04/18/2022)
  We study the decentralized consensus and stochastic optimization problem...
- Abstraction of Linear Consensus Networks with Guaranteed Systemic Performance Measures (09/04/2017)
  A proper abstraction of a large-scale linear consensus network with a de...
- Communication-efficient Variance-reduced Stochastic Gradient Descent (03/10/2020)
  We consider the problem of communication efficient distributed optimizat...
- Higher-order interaction model from geometric measurements (11/20/2022)
  We introduce a higher simplicial generalization of the linear consensus ...
- Communication-Efficient Projection-Free Algorithm for Distributed Optimization (05/20/2018)
  Distributed optimization has gained a surge of interest in recent years....
