Innovation Compression for Communication-efficient Distributed Optimization with Linear Convergence

05/14/2021
by Jiaqi Zhang, et al.

Information compression is essential to reduce communication cost in distributed optimization over peer-to-peer networks. This paper proposes a communication-efficient linearly convergent distributed (COLD) algorithm to solve strongly convex optimization problems. By compressing innovation vectors, which are the differences between decision vectors and their estimates, COLD achieves linear convergence for a class of δ-contracted compressors. We explicitly quantify how the compression affects the convergence rate and show that COLD matches the convergence rate of its uncompressed counterpart. To accommodate a wider class of compressors that includes the binary quantizer, we further design a novel dynamical scaling mechanism and obtain the linearly convergent Dyna-COLD. Importantly, our results strictly improve upon existing results for the quantized consensus problem. Numerical experiments demonstrate the advantages of both algorithms under different compressors.
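The abstract describes the core mechanism only at a high level: each agent transmits a compressed version of the innovation (the difference between its decision vector and the current estimate of it), and the receivers accumulate these corrections into a shared local estimate. The sketch below is a minimal illustration of that idea, not the COLD recursion from the paper: it assumes a top-k selector as one concrete instance of a δ-contracted compressor (with δ = k/d) and shows how repeatedly compressing the innovation drives the estimate to the target vector at a geometric rate.

```python
import numpy as np

def top_k(v, k):
    # Keep the k largest-magnitude entries of v and zero the rest.
    # Top-k is a standard delta-contracted compressor with delta = k / len(v):
    # ||C(v) - v||^2 <= (1 - delta) * ||v||^2 for every v.
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def innovation_step(x, x_hat, k):
    # Compress only the innovation x - x_hat (not x itself); both the sender
    # and its neighbors apply the same correction q to the shared estimate.
    q = top_k(x - x_hat, k)   # the only quantity that would be transmitted
    return q, x_hat + q       # updated estimate of x

# Toy check: with a fixed target x, the estimate reaches x within a few rounds,
# illustrating the geometric shrinkage of the innovation under compression.
rng = np.random.default_rng(0)
x = rng.normal(size=20)
x_hat = np.zeros(20)
for t in range(6):
    q, x_hat = innovation_step(x, x_hat, k=4)
    print(t, np.linalg.norm(x - x_hat))
```

In the full algorithm the decision vectors themselves evolve through local gradient steps and consensus mixing, so the estimate-tracking step above is only one ingredient of the linear convergence argument sketched in the abstract.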


research · 11/15/2022
Linear Convergent Distributed Nash Equilibrium Seeking with Compression
Information compression techniques are often employed to reduce communic...

research · 03/25/2021
Compressed Gradient Tracking Methods for Decentralized Optimization with Linear Convergence
Communication compression techniques are of growing interests for solvin...

research · 04/18/2022
On Arbitrary Compression for Decentralized Consensus and Stochastic Optimization over Directed Networks
We study the decentralized consensus and stochastic optimization problem...

research · 07/01/2020
Linear Convergent Decentralized Optimization with Compression
Communication compression has been extensively adopted to speed up large...

research · 02/18/2020
Distributed Adaptive Newton Methods with Globally Superlinear Convergence
This paper considers the distributed optimization problem over a network...

research · 07/23/2021
Finite-Bit Quantization For Distributed Algorithms With Linear Convergence
This paper studies distributed algorithms for (strongly convex) composit...

research · 07/26/2020
Convex Decreasing Algorithms: Distributed Synthesis and Finite-time Termination in Higher Dimension
We introduce a general mathematical framework for distributed algorithms...
