Decentralized Differentially Private Without-Replacement Stochastic Gradient Descent

09/08/2018
by   Richeng Jin, et al.

While machine learning has achieved remarkable results in a wide variety of domains, training models often requires large datasets that may need to be collected from different individuals. Since an individual's dataset may contain sensitive information, sharing training data can raise severe privacy concerns. One effective approach to building privacy-aware machine learning methods is to leverage the generic framework of differential privacy. Given that stochastic gradient descent (SGD) is one of the most widely adopted methods for large-scale machine learning problems, two decentralized differentially private SGD algorithms are proposed in this work. In particular, we focus on SGD without replacement due to its favorable structure for practical implementation. In addition, both privacy and convergence analyses are provided for the proposed algorithms. Finally, extensive experiments are performed to verify the theoretical results and demonstrate the effectiveness of the proposed algorithms.
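To make the idea concrete, below is a minimal single-node Python sketch of differentially private SGD without replacement: the data are visited in a fresh random permutation each epoch (so every sample is used exactly once per pass), each per-sample gradient is clipped to bound its sensitivity, and Gaussian noise is added before the update. The logistic-loss model, clipping bound C, noise scale sigma, and all function names here are illustrative assumptions for exposition, not the paper's exact algorithms, which are decentralized and come with formal privacy and convergence guarantees.

    # Illustrative sketch only: single-node DP-SGD without replacement.
    # The model (logistic loss), C, and sigma are assumptions, not the
    # paper's construction.
    import numpy as np

    def logistic_grad(w, x, y):
        """Per-sample gradient of the logistic loss (assumed model)."""
        z = y * np.dot(w, x)
        return -y * x / (1.0 + np.exp(z))

    def dp_sgd_without_replacement(X, Y, epochs=5, lr=0.1, C=1.0,
                                   sigma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(epochs):
            # Without-replacement pass: shuffle once per epoch and
            # visit each sample exactly once.
            for i in rng.permutation(n):
                g = logistic_grad(w, X[i], Y[i])
                # Clip the per-sample gradient so its norm (and hence
                # the query's sensitivity) is at most C.
                g = g / max(1.0, np.linalg.norm(g) / C)
                # Gaussian mechanism: noise scaled to the clipped
                # sensitivity.
                g = g + rng.normal(scale=sigma * C, size=d)
                w -= lr * g
        return w

    # Example usage on synthetic linearly separable data:
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    Y = np.sign(X @ rng.normal(size=5))
    w = dp_sgd_without_replacement(X, Y)

Without-replacement (shuffled) sampling is attractive in practice because a single pass over a permuted dataset matches how training data is actually stored and streamed, avoiding the repeated random access that with-replacement sampling requires.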


