Considerations on the Theory of Training Models with Differential Privacy

03/08/2023
by Marten van Dijk, et al.

In federated learning, a set of clients learn collaboratively while each client remains in control of how its local training data is used; in particular, how can each client's local training data remain private? Differential privacy is one method for limiting privacy leakage. We give a general overview of its framework and provable properties, adopt the more recent hypothesis-testing-based definition known as Gaussian DP or f-DP, and discuss Differentially Private Stochastic Gradient Descent (DP-SGD). In this book chapter we stay at a meta level and aim for intuitive explanations and insights.
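For concreteness, here is a minimal sketch of one DP-SGD step in the sense of Abadi et al., not the chapter's own pseudocode: each per-example gradient is clipped to an L2 norm bound C, the clipped gradients are summed, Gaussian noise with standard deviation sigma*C is added, and the noisy average drives a descent step. All names (dp_sgd_step, clipping bound C, noise multiplier sigma, learning rate lr) are illustrative assumptions.

    import numpy as np

    def dp_sgd_step(w, per_example_grads, C=1.0, sigma=1.0, lr=0.1, rng=None):
        # Clip each per-example gradient to L2 norm at most C.
        rng = rng or np.random.default_rng()
        clipped = [g * min(1.0, C / (np.linalg.norm(g) + 1e-12))
                   for g in per_example_grads]
        # Sum the clipped gradients and add isotropic Gaussian noise
        # whose scale is tied to the clipping bound (the sensitivity).
        noisy_sum = np.sum(clipped, axis=0) + rng.normal(0.0, sigma * C, size=w.shape)
        # Average over the batch and take a gradient-descent step.
        return w - lr * noisy_sum / len(per_example_grads)

Roughly speaking, since the clipped sum has sensitivity C and the noise has standard deviation sigma*C, one such step is mu-Gaussian DP with mu = 1/sigma in the f-DP framework, and f-DP's composition and subsampling rules then track the privacy loss of the full training run.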


Related research

- Dynamic Differential-Privacy Preserving SGD (10/30/2021): Differentially-Private Stochastic Gradient Descent (DP-SGD) prevents tra...
- Feature Space Hijacking Attacks against Differentially Private Split Learning (01/11/2022): Split learning and differential privacy are technologies with growing po...
- On the f-Differential Privacy Guarantees of Discrete-Valued Mechanisms (02/19/2023): We consider a federated data analytics problem in which a server coordin...
- Differential Private Hogwild! over Distributed Local Data Sets (02/17/2021): We consider the Hogwild! setting where clients use local SGD iterations ...
- A(DP)^2SGD: Asynchronous Decentralized Parallel Stochastic Gradient Descent with Differential Privacy (08/21/2020): As deep learning models are usually massive and complex, distributed lea...
- Differentially Private Federated Learning for Resource-Constrained Internet of Things (03/28/2020): With the proliferation of smart devices having built-in sensors, Interne...
- Shuffled Check-in: Privacy Amplification towards Practical Distributed Learning (06/07/2022): Recent studies of distributed computation with formal privacy guarantees...
