DIGEST: Fast and Communication Efficient Decentralized Learning with Local Updates

07/14/2023
by Peyman Gholami, et al.

Two widely considered decentralized learning approaches are Gossip and random-walk-based learning. Gossip algorithms (both synchronous and asynchronous versions) suffer from high communication cost, while random-walk-based learning suffers from increased convergence time. In this paper, we design DIGEST, a fast and communication-efficient asynchronous decentralized learning mechanism that takes advantage of both the Gossip and random-walk ideas, focusing on stochastic gradient descent (SGD). DIGEST is an asynchronous decentralized algorithm built on local-SGD algorithms, which were originally designed for communication-efficient centralized learning. We design both single-stream and multi-stream DIGEST; communication overhead may grow with the number of streams, yielding a convergence versus communication-overhead trade-off that can be exploited. We analyze the convergence of single- and multi-stream DIGEST and prove that both algorithms approach the optimal solution asymptotically for both iid and non-iid data distributions. We evaluate the performance of single- and multi-stream DIGEST on logistic regression and a deep neural network, ResNet20. The simulation results confirm that multi-stream DIGEST has favorable convergence properties: its convergence time is better than or comparable to that of the baselines in the iid setting, and it outperforms the baselines in the non-iid setting.
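To make the two ingredients named in the abstract concrete, below is a minimal, illustrative Python sketch that combines local SGD at every node with a single random-walk "stream" token that mixes models as it moves through the network. This is a toy under stated assumptions, not the authors' DIGEST algorithm: the least-squares problem, the uniform random walk, the 0.5 mixing weight, and all identifiers (sgd_step, token_model, etc.) are hypothetical choices made here for illustration.

```python
# Illustrative sketch only -- NOT the authors' DIGEST algorithm.
# It combines the two ideas the abstract names: local SGD at every
# node (communication-free updates, as in local-SGD methods) and a
# single random-walk "stream" token that mixes a node's model with
# a traveling model. All parameters below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem split across nodes (iid split here).
n_nodes, dim, n_per_node = 8, 5, 50
w_true = rng.normal(size=dim)
X = [rng.normal(size=(n_per_node, dim)) for _ in range(n_nodes)]
y = [A @ w_true + 0.1 * rng.normal(size=n_per_node) for A in X]

def sgd_step(w, A, b, lr=0.01):
    """One stochastic gradient step on a random local sample."""
    i = rng.integers(len(b))
    grad = (A[i] @ w - b[i]) * A[i]
    return w - lr * grad

# Every node holds its own model; the token carries a mixed model.
models = [rng.normal(size=dim) for _ in range(n_nodes)]
token_model = models[0].copy()
token_at = 0

local_steps, rounds = 10, 500
for _ in range(rounds):
    # All nodes run local SGD "in parallel" (no communication).
    for v in range(n_nodes):
        for _ in range(local_steps):
            models[v] = sgd_step(models[v], X[v], y[v])
    # The single stream visits one node, averages with it, then
    # moves on (a uniform random walk stands in for a schedule).
    mixed = 0.5 * (token_model + models[token_at])
    token_model, models[token_at] = mixed.copy(), mixed
    token_at = rng.integers(n_nodes)

err = np.mean([np.linalg.norm(w - w_true) for w in models])
print(f"mean distance to w*: {err:.3f}")
```

In this reading, a multi-stream variant would run several such tokens in parallel, spreading information faster at the cost of more messages per round, which matches the convergence/communication trade-off the abstract describes.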


