LEASGD: an Efficient and Privacy-Preserving Decentralized Algorithm for Distributed Learning

11/27/2018
by Hsin-Pai Cheng, et al.

Distributed learning systems have enabled training large-scale models over large amounts of data in significantly shorter time. In this paper, we focus on decentralized distributed deep learning systems and aim to achieve differential privacy with a good convergence rate and low communication cost. To achieve this goal, we propose a new learning algorithm, LEASGD (Leader-Follower Elastic Averaging Stochastic Gradient Descent), which is driven by a novel Leader-Follower topology and a differential privacy model. We provide a theoretical analysis of the convergence rate and of the trade-off between performance and privacy in the private setting. The experimental results show that LEASGD outperforms the state-of-the-art decentralized learning algorithm DPSGD by achieving steadily lower loss within the same number of iterations and by reducing the communication cost by 30%. Moreover, LEASGD spends less differential privacy budget and attains higher final accuracy than DPSGD in the private setting.
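The abstract names the two mechanisms behind LEASGD: an elastic-averaging update that pulls follower workers toward the current leader, and per-worker gradient perturbation for differential privacy. The sketch below illustrates those mechanics only; the leader-selection rule (lowest local loss), the hyperparameters (rho, sigma, clip_norm), and the toy quadratic objectives are our own illustrative assumptions, not the paper's actual algorithm or settings.

```python
# Illustrative sketch of a leader-follower elastic-averaging step with
# Gaussian-noise differential privacy. Names and hyperparameters are
# assumptions for demonstration, not LEASGD's published implementation.
import numpy as np

rng = np.random.default_rng(0)

def dp_sgd_step(w, grad, lr=0.1, clip_norm=1.0, sigma=0.1):
    """Local SGD step: clip the gradient to bound sensitivity, then add
    Gaussian noise (the standard Gaussian mechanism used in DP-SGD)."""
    grad = grad * min(1.0, clip_norm / (np.linalg.norm(grad) + 1e-12))
    noise = rng.normal(0.0, sigma * clip_norm, size=grad.shape)
    return w - lr * (grad + noise)

def elastic_pull(follower_w, leader_w, rho=0.05):
    """Elastic averaging: a symmetric elastic force pulls the follower
    toward the leader and the leader toward the follower."""
    diff = follower_w - leader_w
    return follower_w - rho * diff, leader_w + rho * diff

# Toy setup: each worker minimizes its own quadratic 0.5 * ||w - t||^2.
targets = [np.array([1.0, 2.0]), np.array([0.5, -1.0]), np.array([2.0, 0.0])]
weights = [np.zeros(2) for _ in targets]

for step in range(200):
    losses = [0.5 * np.sum((w - t) ** 2) for w, t in zip(weights, targets)]
    leader = int(np.argmin(losses))  # assume the lowest-loss worker leads
    for i, t in enumerate(targets):  # noisy local update on every worker
        weights[i] = dp_sgd_step(weights[i], grad=weights[i] - t)
    for i in range(len(weights)):    # followers exchange only with leader
        if i != leader:
            weights[i], weights[leader] = elastic_pull(weights[i], weights[leader])
```

In this toy version, each follower communicates only with the leader rather than with every neighbor, which is in the spirit of the communication savings the abstract reports; the clipping bound is what allows the Gaussian noise to be translated into a formal privacy budget.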


Related research

research · 01/16/2020
A Better Bound Gives a Hundred Rounds: Enhanced Privacy Guarantees via f-Divergences
We derive the optimal differential privacy (DP) parameters of a mechanis...

research · 07/11/2022
Privacy-preserving Decentralized Deep Learning with Multiparty Homomorphic Encryption
Decentralized deep learning plays a key role in collaborative model trai...

research · 09/03/2020
Private Weighted Random Walk Stochastic Gradient Descent
We consider a decentralized learning setting in which data is distribute...

research · 10/17/2018
Distributed Learning over Unreliable Networks
Most of today's distributed machine learning systems assume reliable ne...

research · 05/27/2018
cpSGD: Communication-efficient and differentially-private distributed SGD
Distributed stochastic gradient descent is an important subroutine in di...

research · 06/14/2020
Differentially Private Decentralized Learning
Decentralized learning has received great attention for its high efficie...

research · 05/17/2023
Convergence and Privacy of Decentralized Nonconvex Optimization with Gradient Clipping and Communication Compression
Achieving communication efficiency in decentralized machine learning has...
