Compression Boosts Differentially Private Federated Learning

11/10/2020
by   Raouf Kerkouche, et al.

Federated Learning allows distributed entities to train a common model collaboratively without sharing their own data. Although it avoids centralized data collection by exchanging only parameter updates, it remains vulnerable to inference and reconstruction attacks in which a malicious entity learns private information about the participants' training data from the captured gradients. Differential Privacy provides theoretically sound guarantees against such inference attacks by adding noise to the exchanged update vectors. However, the added noise is proportional to the model size, which can be very large for modern neural networks, and this can severely degrade model quality. In this paper, compressive sensing is used to reduce the model size and hence improve model quality without sacrificing privacy. We show experimentally, using two datasets, that our privacy-preserving proposal can reduce the communication costs by up to 95% compared to traditional non-private federated learning schemes.
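The core idea in the abstract, that compressing updates before noising lets the privacy noise scale with the compressed dimension rather than the full model size, can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function names, the random Gaussian sensing matrix, and the clipping and noise parameters are all assumptions for the example.

```python
import numpy as np

def compress_and_privatize(update, k, clip=1.0, sigma=1.0, seed=0):
    """Illustrative sketch: project a d-dimensional update down to k
    dimensions with a random sensing matrix (compressive sensing),
    clip its L2 norm, then apply the Gaussian mechanism. The noise is
    added to k coordinates instead of d, so its total magnitude is
    smaller for the same per-coordinate scale."""
    d = update.shape[0]
    rng = np.random.default_rng(seed)
    # Random Gaussian sensing matrix; the seed could be shared so the
    # server can use the same projection for all clients.
    phi = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))
    compressed = phi @ update                 # k-dimensional measurement
    norm = np.linalg.norm(compressed)
    compressed = compressed * min(1.0, clip / norm)   # L2 clipping
    # Gaussian mechanism on the compressed vector: noise dimension is k, not d.
    return compressed + rng.normal(0.0, sigma * clip, size=k)

update = np.ones(100_000)                     # stand-in for a flattened model update
noisy = compress_and_privatize(update, k=500)
print(noisy.shape)                            # (500,) -- 200x fewer values to transmit
```

The server would aggregate the noisy compressed vectors and (approximately) reconstruct the update, for example via a sparse-recovery solver; that reconstruction step is omitted here.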


Related research

- 10/15/2020 · Federated Learning in Adversarial Settings: Federated Learning enables entities to collaboratively learn a shared pr...
- 10/09/2020 · Voting-based Approaches For Differentially Private Federated Learning: While federated learning (FL) enables distributed agents to collaborativ...
- 02/27/2021 · Constrained Differentially Private Federated Learning for Low-bandwidth Devices: Federated learning becomes a prominent approach when different entities ...
- 06/02/2022 · Impact of Sampling on Locally Differentially Private Data Collection: With the recent bloom of data, there is a huge surge in threats against ...
- 02/08/2022 · Practical Challenges in Differentially-Private Federated Survival Analysis of Medical Data: Survival analysis or time-to-event analysis aims to model and predict th...
- 05/23/2022 · Privacy-preserving Data Filtering in Federated Learning Using Influence Approximation: Federated Learning by nature is susceptible to low-quality, corrupted, o...
- 12/01/2020 · MYSTIKO: Cloud-Mediated, Private, Federated Gradient Descent: Federated learning enables multiple, distributed participants (potential...
