Sparse-Push: Communication- & Energy-Efficient Decentralized Distributed Learning over Directed Time-Varying Graphs with non-IID Datasets

02/10/2021
by Sai Aparna Aketi, et al.

Current deep learning (DL) systems rely on a centralized computing paradigm, which limits the amount of available training data, increases system latency, and adds privacy and security constraints. On-device learning, enabled by decentralized and distributed training of DL models over peer-to-peer, wirelessly connected edge devices, not only alleviates the above limitations but also enables next-generation applications that need DL models to continuously interact with and learn from their environment. However, this necessitates the development of novel training algorithms that train DL models over time-varying and directed peer-to-peer graph structures while minimizing the amount of communication between devices and remaining resilient to non-IID data distributions. In this work, we propose Sparse-Push, a communication-efficient decentralized distributed training algorithm that supports training over peer-to-peer, directed, and time-varying graph topologies. The proposed algorithm enables a 466x reduction in communication with only a 1% degradation in performance when training various DL models, such as ResNet-20 and VGG11, over the CIFAR-10 dataset. Further, we demonstrate how communication compression can lead to significant performance degradation in the case of non-IID datasets, and propose the Skew-Compensated Sparse-Push algorithm that recovers this performance drop while maintaining similar levels of communication compression.
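The core idea can be sketched as a push-sum (Stochastic Gradient Push-style) update in which each node compresses its outgoing model message with top-k sparsification and keeps an error-feedback residual, so untransmitted components are re-injected in later rounds rather than lost. The sketch below is illustrative only, assuming SGP-style column-stochastic mixing over a directed graph; the function names, mixing scheme, and hyperparameters are assumptions, not the paper's exact algorithm.

    import numpy as np

    def topk_sparsify(vec, k, residual):
        # Error feedback: re-inject previously untransmitted components.
        corrected = vec + residual
        idx = np.argpartition(np.abs(corrected), -k)[-k:]  # k largest magnitudes
        sparse = np.zeros_like(corrected)
        sparse[idx] = corrected[idx]
        return sparse, corrected - sparse  # (message, new residual)

    def sparse_push_step(params, weights, residuals, grads, out_neighbors, lr, k):
        # One round of a Sparse-Push-style update (hypothetical sketch).
        # params[i] / weights[i] are the push-sum numerator x_i and scalar w_i;
        # grads[i] is assumed to be evaluated at the de-biased model x_i / w_i.
        n = len(params)
        new_params = [np.zeros_like(p) for p in params]
        new_weights = [0.0] * n
        for i in range(n):
            x = params[i] - lr * grads[i]           # local SGD step
            msg, residuals[i] = topk_sparsify(x, k, residuals[i])
            dests = out_neighbors[i] + [i]          # push to out-neighbors and self
            share = 1.0 / len(dests)                # column-stochastic mixing weights
            for j in dests:
                new_params[j] += share * msg
                new_weights[j] += share * weights[i]
        return new_params, new_weights, residuals

A node's usable model at any time is the de-biased estimate z_i = x_i / w_i; the push-sum weights w_i correct for the bias that directed (non-doubly-stochastic) communication would otherwise introduce. A toy run on a time-varying directed ring (entirely assumed) might look like:

    # 3 nodes, edges change every round, quadratic objective f(z) = ||z||^2 / 2.
    rng = np.random.default_rng(0)
    params = [rng.standard_normal(10) for _ in range(3)]
    weights = [1.0] * 3
    residuals = [np.zeros(10) for _ in range(3)]
    for t in range(200):
        grads = [p / w for p, w in zip(params, weights)]   # grad at z = x / w
        graph = [[(i + 1 + t % 2) % 3] for i in range(3)]  # time-varying directed ring
        params, weights, residuals = sparse_push_step(
            params, weights, residuals, grads, graph, lr=0.1, k=3)
    print([p / w for p, w in zip(params, weights)])        # de-biased models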

