Communication-Efficient Distributionally Robust Decentralized Learning

05/31/2022
by Matteo Zecchin, et al.

Decentralized learning algorithms empower interconnected edge devices to share data and computational resources to collaboratively train a machine learning model without the aid of a central coordinator (e.g., an orchestrating base station). When the data distributions at the network devices are heterogeneous, collaboration can yield predictors with unsatisfactory performance for a subset of the devices. For this reason, in this work we formulate a distributionally robust decentralized learning task and propose a decentralized single-loop gradient descent/ascent algorithm (AD-GDA) to solve the underlying minimax optimization problem. We make the algorithm communication-efficient by employing a compressed consensus scheme, and we provide convergence guarantees for smooth convex and non-convex loss functions. Finally, we corroborate the theoretical findings with empirical evidence of the ability of the proposed algorithm to provide unbiased predictors over a network of collaborating devices with highly heterogeneous data distributions.
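The minimax structure behind such a descent/ascent scheme can be sketched numerically: each device takes a gradient descent step on its local model weighted by its dual (distributional) weight, the devices average compressed model information through a consensus step, and the dual weights are updated by gradient ascent and projected back onto the probability simplex. Below is a minimal Python sketch of one such single-loop round under stated assumptions; the quadratic local losses, the top-k compressor, the fully connected mixing matrix W, the step sizes, and the CHOCO-style public copies are illustrative choices, not the paper's exact AD-GDA recipe.

```python
# Minimal sketch of compressed decentralized gradient descent/ascent for a
# distributionally robust objective. All modeling choices below are assumptions
# made for illustration, not the AD-GDA algorithm as specified in the paper.
import numpy as np

rng = np.random.default_rng(0)

n_devices, dim = 4, 10
W = np.full((n_devices, n_devices), 1.0 / n_devices)  # doubly stochastic mixing matrix (fully connected, assumed)
x = rng.normal(size=(n_devices, dim))                  # local model copies
x_hat = np.zeros_like(x)                               # public (compressed) copies shared with neighbors
lam = np.full(n_devices, 1.0 / n_devices)              # dual weights over devices (probability simplex)
eta_x, eta_lam, gamma, k = 0.1, 0.05, 0.5, 3           # step sizes, consensus rate, compression level (assumed)

def local_loss_grad(i, xi):
    """Placeholder quadratic local loss; stands in for device i's empirical risk."""
    target = np.full(dim, float(i))
    return 0.5 * np.sum((xi - target) ** 2), xi - target

def top_k(v, k):
    """Keep only the k largest-magnitude entries (one possible compression operator)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def project_simplex(v):
    """Euclidean projection onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / np.arange(1, v.size + 1) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

for _ in range(200):
    losses, grads = zip(*(local_loss_grad(i, x[i]) for i in range(n_devices)))
    # Primal descent: each device steps along its local gradient, scaled by its dual weight.
    x = x - eta_x * lam[:, None] * np.stack(grads)
    # Compressed consensus: devices exchange top-k compressed differences between
    # their models and public copies, then gossip on the public copies.
    q = np.stack([top_k(x[i] - x_hat[i], k) for i in range(n_devices)])
    x_hat = x_hat + q
    x = x + gamma * (W @ x_hat - x_hat)
    # Dual ascent: shift weight toward devices with larger loss, then project onto the simplex.
    lam = project_simplex(lam + eta_lam * np.array(losses))

print("final dual weights:", lam)
```

In this toy run the dual weights concentrate on the devices with the largest local loss, which is the behavior a distributionally robust (worst-case) formulation is designed to produce.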

Related research

08/13/2018
COLA: Communication-Efficient Decentralized Linear Learning
Decentralized machine learning is a promising emerging paradigm in view ...

07/31/2022
Online Decentralized Frank-Wolfe: From theoretical bound to applications in smart-building
The design of decentralized learning algorithms is important in the fast...

05/23/2022
Theoretical Analysis of Primal-Dual Algorithm for Non-Convex Stochastic Decentralized Optimization
In recent years, decentralized learning has emerged as a powerful tool n...

10/14/2021
Resource-constrained Federated Edge Learning with Heterogeneous Data: Formulation and Analysis
Efficient collaboration between collaborative machine learning and wirel...

07/24/2019
Robust and Communication-Efficient Collaborative Learning
We consider a decentralized learning problem, where a set of computing n...

08/29/2022
DR-DSGD: A Distributionally Robust Decentralized Learning Algorithm over Graphs
In this paper, we propose to solve a regularized distributionally robust...

05/08/2022
Communication Compression for Decentralized Learning with Operator Splitting Methods
In decentralized learning, operator splitting methods using a primal-dua...
