Taming Client Dropout for Distributed Differential Privacy in Federated Learning

09/26/2022
by Zhifeng Jiang et al.

Federated learning (FL) is increasingly deployed across many clients (e.g., mobile devices) to train a shared model over decentralized data. To address privacy concerns, FL systems must protect clients' data from being revealed during training and must also control the leakage through trained models when they are exposed to untrusted domains. Distributed differential privacy (DP) offers an appealing solution in this regard, as it achieves an informed tradeoff between privacy and utility without a trusted server. However, existing distributed DP mechanisms are impractical in real-world settings: to handle realistic scenarios with client dropout, they make strong assumptions about client participation yet still deliver either poor privacy guarantees or unsatisfactory training accuracy. We present Hyades, a distributed differentially private FL framework that is highly efficient and resilient to client dropout. First, we develop a new privacy accounting technique under the notion of Rényi DP that tightly bounds the privacy loss in the presence of dropout before client sampling in FL; this enables Hyades to set a minimum target noise level for each training round. Second, we propose a novel "add-then-remove" masking scheme that enforces this target noise level even when some sampled clients ultimately drop out. Third, we design an efficient secure aggregation mechanism that optimally pipelines communication with computation for faster execution. Evaluation on a large-scale cloud deployment shows that Hyades efficiently handles client dropout in a variety of realistic scenarios, attaining the optimal privacy-utility tradeoff and accelerating training by up to 2.1× compared to existing solutions.
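To make the noise-calibration step concrete, here is a minimal sketch in Python. It is not the paper's accountant: it assumes unit-L2-sensitivity client updates, uses the standard Rényi-DP bound for the Gaussian mechanism and the standard RDP-to-(ε, δ) conversion, and ignores the amplification from client sampling that the paper's analysis additionally covers. The function names and parameters are illustrative.

```python
import math

def rdp_gaussian(sigma: float, alpha: float) -> float:
    """RDP of the Gaussian mechanism with L2-sensitivity 1: eps(alpha) = alpha / (2 sigma^2)."""
    return alpha / (2 * sigma ** 2)

def rdp_to_dp(rdp_eps: float, alpha: float, delta: float) -> float:
    """Standard RDP -> (eps, delta)-DP conversion."""
    return rdp_eps + math.log(1 / delta) / (alpha - 1)

def per_client_sigma(target_sigma: float, n_sampled: int, max_dropout: float) -> float:
    """Noise each sampled client must add so that the *survivors'* summed noise
    still reaches target_sigma even if a max_dropout fraction drops out
    (independent Gaussian noise adds in variance)."""
    worst_survivors = math.ceil(n_sampled * (1 - max_dropout))
    return target_sigma / math.sqrt(worst_survivors)

# Example: 100 sampled clients, up to 30% dropout, target aggregate sigma = 4.0.
sigma_local = per_client_sigma(4.0, 100, 0.3)
eps = min(rdp_to_dp(rdp_gaussian(4.0, a), a, delta=1e-5) for a in range(2, 256))
print(f"per-client sigma = {sigma_local:.3f}, one-round epsilon ~= {eps:.3f}")
```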
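The "add-then-remove" idea can be pictured with a toy simulation: every sampled client first adds noise sized for the worst-case survivor count, and once the actual survivor set is known, each survivor regenerates its seeded noise and subtracts the surplus, so the aggregate lands exactly on the target noise level. This is a sketch of our reading of the abstract, not Hyades' actual protocol; in particular, the real system would run both phases under secure aggregation so the server only ever sees sums, never individual masked updates or corrections.

```python
import numpy as np

TARGET_SIGMA = 4.0   # required std of the total noise in the aggregate
N_SAMPLED    = 10    # clients sampled this round
MAX_DROPOUT  = 0.3   # assumed worst-case dropout fraction
DIM          = 5     # model-update dimension

worst_survivors = int(np.ceil(N_SAMPLED * (1 - MAX_DROPOUT)))
sigma_add = TARGET_SIGMA / np.sqrt(worst_survivors)   # "add"-phase noise std

rng = np.random.default_rng(0)
updates = rng.normal(size=(N_SAMPLED, DIM))           # clients' true updates
# Each client derives its noise from a private seed, so it can later
# regenerate exactly the same vector and cancel part of it.
noise = np.stack([np.random.default_rng(1000 + i).standard_normal(DIM)
                  for i in range(N_SAMPLED)])

# Phase 1 ("add"): every sampled client uploads update + sigma_add * z_i.
masked = updates + sigma_add * noise

# Some clients drop out; the server learns who survived both phases.
survivors = np.array([0, 1, 2, 4, 5, 6, 8, 9])        # clients 3 and 7 dropped
m = len(survivors)
sigma_keep = TARGET_SIGMA / np.sqrt(m)                # noise each survivor keeps
assert sigma_keep <= sigma_add, "actual dropout exceeded the assumed bound"

# Phase 2 ("remove"): survivors regenerate z_i and subtract the surplus.
corrections = -(sigma_add - sigma_keep) * noise[survivors]

aggregate = masked[survivors].sum(axis=0) + corrections.sum(axis=0)
# aggregate = sum of survivor updates + sigma_keep * sum_i z_i, whose noise
# std is sqrt(m) * sigma_keep == TARGET_SIGMA: exactly the target level.
```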
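Finally, a minimal sketch of the pipelining idea: rather than masking the entire update and then uploading it, the client overlaps the two stages so that chunk k+1 is being masked while chunk k is in flight. The chunking granularity, the helper names (mask_chunk, send_chunk), and the timings below are all assumptions for illustration, not Hyades' API.

```python
import threading, queue, time

def mask_chunk(chunk_id: int) -> int:
    time.sleep(0.05)              # stand-in for PRG expansion / masking (CPU)
    return chunk_id

def send_chunk(chunk_id: int) -> None:
    time.sleep(0.05)              # stand-in for the network transfer (I/O)

def pipelined_upload(n_chunks: int) -> None:
    ready = queue.Queue(maxsize=2)  # small buffer between the two stages

    def sender():
        while (c := ready.get()) is not None:
            send_chunk(c)

    t = threading.Thread(target=sender)
    t.start()
    for k in range(n_chunks):       # masking chunk k+1 overlaps sending chunk k
        ready.put(mask_chunk(k))
    ready.put(None)                 # sentinel: no more chunks
    t.join()

start = time.time()
pipelined_upload(8)
# ~0.45s pipelined vs. ~0.80s if masking and sending ran strictly in sequence
print(f"elapsed: {time.time() - start:.2f}s")
```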


