Decentralized Composite Optimization with Compression

08/10/2021
by Yao Li, et al.

Decentralized optimization and communication compression have shown great potential for accelerating distributed machine learning by mitigating the communication bottleneck in practice. While existing decentralized algorithms with communication compression mostly focus on problems with only smooth components, we study the decentralized stochastic composite optimization problem with a potentially non-smooth component. We propose Prox-LEAD, a Proximal gradient LinEAr convergent Decentralized algorithm with compression, together with rigorous theoretical analyses in both the general stochastic setting and the finite-sum setting. Our theorems indicate that Prox-LEAD works with arbitrary compression precision and reduces the communication cost almost for free. The superiority of the proposed algorithms is demonstrated through comparisons with state-of-the-art algorithms in terms of convergence complexity and through numerical experiments. Our algorithmic framework also sheds light on how to introduce compressed communication into other primal-dual algorithms by reducing the impact of inexact iterations, which may be of independent interest.
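To make the setting concrete: in decentralized composite optimization, n agents cooperatively minimize the average of local smooth losses f_i(x) plus a shared, possibly non-smooth regularizer r(x) (for example, an l1 penalty), while exchanging only compressed messages with their neighbors over a communication graph. The sketch below is not Prox-LEAD itself; it is a minimal illustration, under assumptions of our own, of the three ingredients the abstract refers to: local proximal-gradient steps, gossip averaging with a mixing matrix, and compression of iterate differences. All names and parameters here (topk_compress, soft_threshold, the ring mixing matrix, step sizes) are illustrative choices, not the paper's notation.

```python
# Minimal sketch (not the Prox-LEAD updates): n agents minimize
# (1/n) * sum_i f_i(x) + r(x), where each f_i is a smooth local loss
# (least squares here) and r(x) = lam * ||x||_1 is the non-smooth part
# handled by a proximal step. Communication is "compressed" by a top-k
# sparsifier applied to the difference between the current iterate and
# a locally kept reference copy.
import numpy as np

def topk_compress(v, k):
    """Keep the k largest-magnitude entries of v; zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

rng = np.random.default_rng(0)
n_agents, dim, lam, lr, k = 5, 20, 0.05, 0.05, 4

# Each agent holds a private least-squares objective f_i(x) = 0.5*||A_i x - b_i||^2.
A = [rng.normal(size=(30, dim)) for _ in range(n_agents)]
b = [rng.normal(size=30) for _ in range(n_agents)]

# Doubly stochastic mixing matrix for a ring topology.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

x = np.zeros((n_agents, dim))    # local iterates
ref = np.zeros((n_agents, dim))  # reference copies used for difference compression

for t in range(200):
    grads = np.stack([A[i].T @ (A[i] @ x[i] - b[i]) for i in range(n_agents)])
    # Each agent sends a compressed difference; neighbors rebuild an estimate of x_i.
    sent = np.stack([topk_compress(x[i] - ref[i], k) for i in range(n_agents)])
    est = ref + sent   # estimates of the agents' iterates held by their neighbors
    ref = est          # update the shared reference copies
    mixed = W @ est    # gossip averaging on the estimates
    # Local proximal-gradient step on the mixed point.
    x = soft_threshold(mixed - lr * grads, lr * lam)

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
```

The difference-compression trick (send only a sparsified update relative to a shared reference) is what keeps the scheme stable at coarse compression levels; a plain compressed-iterate exchange would generally not converge to the exact solution.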


Related research

05/28/2022 · Stochastic Gradient Methods with Compressed Communication for Decentralized Saddle Point Problems
We propose two stochastic gradient algorithms to solve a class of saddle...

11/03/2020 · A Linearly Convergent Algorithm for Decentralized Optimization: Sending Less Bits for Free!
Decentralized optimization methods enable on-device training of machine ...

10/09/2021 · An Empirical Study on Compressed Decentralized Stochastic Gradient Algorithms with Overparameterized Models
This paper considers decentralized optimization with application to mach...

11/20/2020 · On the Benefits of Multiple Gossip Steps in Communication-Constrained Decentralized Optimization
In decentralized optimization, it is common algorithmic practice to have...

05/08/2022 · Communication Compression for Decentralized Learning with Operator Splitting Methods
In decentralized learning, operator splitting methods using a primal-dua...

07/01/2020 · Linear Convergent Decentralized Optimization with Compression
Communication compression has been extensively adopted to speed up large...

06/21/2020 · Optimal and Practical Algorithms for Smooth and Strongly Convex Decentralized Optimization
We consider the task of decentralized minimization of the sum of smooth ...
