Communication Compression for Decentralized Learning with Operator Splitting Methods

05/08/2022
by Yuki Takezawa, et al.

In decentralized learning, operator splitting methods based on a primal-dual formulation, such as Edge-Consensus Learning (ECL), have been shown to be robust to heterogeneous data and have attracted significant attention in recent years. However, in the ECL, each node must exchange dual variables with its neighbors, and these exchanges incur significant communication costs. Many compression methods have been proposed for Gossip-based algorithms, but Gossip-based algorithms do not perform well when the data distributions held by the nodes are statistically heterogeneous. In this work, we propose a novel compression framework for the ECL, called Communication Compressed ECL (C-ECL). Specifically, we reformulate the update formulas of the ECL and propose to compress the update values of the dual variables. We demonstrate experimentally that the C-ECL achieves performance nearly equivalent to that of the ECL with fewer parameter exchanges, and that the C-ECL is more robust to heterogeneous data than Gossip-based algorithms.
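To make the core idea concrete, below is a minimal sketch of how a compression operator could be applied to the dual-variable updates a node sends to its neighbors. This is not the authors' implementation: the abstract only states that the update values of the dual variables are compressed, so top-k sparsification is assumed here purely for illustration, and the function names (`top_k_compress`, `dual_update_message`) are hypothetical.

```python
import numpy as np

def top_k_compress(update: np.ndarray, k: int) -> np.ndarray:
    """Keep the k largest-magnitude entries of the update; zero out the rest.

    Top-k sparsification is one common compression operator (an assumption
    here); a sparse vector like this can be transmitted with far fewer
    bits than the dense update.
    """
    compressed = np.zeros_like(update)
    idx = np.argpartition(np.abs(update), -k)[-k:]
    compressed[idx] = update[idx]
    return compressed

def dual_update_message(dual_new: np.ndarray, dual_prev: np.ndarray, k: int) -> np.ndarray:
    """Hypothetical sketch of one communication round at a node:
    rather than sending the full dual variable, send only a compressed
    version of its *change* since the previous round."""
    delta = dual_new - dual_prev          # update value of the dual variable
    return top_k_compress(delta, k)       # compressed message to a neighbor
```

The point of the sketch is that only the compressed change in the dual variable crosses the network, rather than the full dense variable, which is where the communication savings over the plain ECL would come from.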
