Differential Privacy and Byzantine Resilience in SGD: Do They Add Up?

02/16/2021
by   Rachid Guerraoui, et al.

This paper addresses the problem of combining Byzantine resilience with privacy in machine learning (ML). Specifically, we study whether a distributed implementation of the renowned Stochastic Gradient Descent (SGD) learning algorithm is feasible with both differential privacy (DP) and Byzantine resilience. To the best of our knowledge, this is the first work to tackle this problem from a theoretical point of view. Intuitively, it should be straightforward to merge standard solutions for these two (seemingly) orthogonal issues. However, a key finding of our analyses is that classical approaches to Byzantine resilience and DP in ML are incompatible. More precisely, we show that a direct composition of these techniques makes the guarantees of the resulting SGD algorithm depend unfavourably upon the number of parameters in the ML model, making the training of large models practically infeasible. We validate our theoretical results through numerical experiments on publicly available datasets, showing that it is impractical to simultaneously ensure DP and Byzantine resilience even for reasonable model sizes.
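To make the "direct composition" discussed in the abstract concrete, the following is a minimal NumPy sketch, not the paper's implementation: each honest worker clips its gradient and adds Gaussian noise (the standard DP Gaussian mechanism), and the server aggregates the released gradients with a coordinate-wise median, a classical Byzantine-robust rule. The function names, clipping threshold, and noise scale are illustrative assumptions.

```python
import numpy as np

def dp_noisy_gradient(grad, clip=1.0, sigma=1.0, rng=None):
    """Clip the gradient to L2 norm `clip`, then add Gaussian noise (Gaussian mechanism)."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip / (norm + 1e-12))
    return clipped + rng.normal(0.0, sigma * clip, size=grad.shape)

def robust_aggregate(grads):
    """Coordinate-wise median: one classical Byzantine-robust aggregation rule."""
    return np.median(np.stack(grads), axis=0)

def sgd_step(params, worker_grads, lr=0.1, clip=1.0, sigma=1.0):
    """One distributed SGD step composing DP noising (per worker) with robust aggregation (at the server)."""
    noisy = [dp_noisy_gradient(g, clip, sigma) for g in worker_grads]
    return params - lr * robust_aggregate(noisy)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    params = np.zeros(10)
    grads = [rng.normal(size=10) for _ in range(5)]  # gradients from 5 honest workers
    grads.append(np.full(10, 1e6))                   # one Byzantine worker sends garbage
    print(sgd_step(params, grads))
```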


Related research

10/08/2021

Combining Differential Privacy and Byzantine Resilience in Distributed SGD

Privacy and Byzantine resilience (BR) are two crucial requirements of mo...

04/29/2022

Bridging Differential Privacy and Byzantine-Robustness via Model Aggregation

This paper aims at jointly addressing two seemingly conflicting issues in f...

10/07/2021

Complex-valued deep learning with differential privacy

We present ζ-DP, an extension of differential privacy (DP) to complex-va...

10/12/2020

Garfield: System Support for Byzantine Machine Learning

Byzantine Machine Learning (ML) systems are nowadays vulnerable for they...

01/28/2022

Differential Privacy Guarantees for Stochastic Gradient Langevin Dynamics

We analyse the privacy leakage of noisy stochastic gradient descent by m...

09/22/2022

Making Byzantine Decentralized Learning Efficient

Decentralized-SGD (D-SGD) distributes heavy learning tasks across multip...

05/31/2022

Dropbear: Machine Learning Marketplaces made Trustworthy with Byzantine Model Agreement

Marketplaces for machine learning (ML) models are emerging as a way for ...