Hierarchical Federated Learning with Privacy

06/10/2022
by Varun Chandrasekaran et al.

Federated learning (FL), where data remains at the federated clients and only gradient updates are shared with a central aggregator, was assumed to be private. Recent work demonstrates that adversaries with gradient-level access can mount successful inference and reconstruction attacks. In such settings, differentially private (DP) learning is known to provide resilience. However, the status quo approaches (central and local DP) introduce disparate utility vs. privacy trade-offs. In this work, we take the first step towards mitigating such trade-offs through hierarchical FL (HFL). We demonstrate that introducing a new intermediary level where calibrated DP noise can be added yields better privacy vs. utility trade-offs; we term this hierarchical DP (HDP). Our experiments with 3 different datasets (commonly used as benchmarks for FL) suggest that HDP produces models as accurate as those obtained using central DP, where noise is added at a central aggregator. Such an approach also provides protection against inference adversaries comparable to that of local DP, where noise is added at the federated clients.
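The abstract's core idea can be illustrated with a minimal aggregation sketch. This is not the authors' implementation: the grouping of clients under intermediaries, the clipping norm, and the noise scale are all illustrative assumptions. The point is only to show where the noise is injected under HDP: once per intermediary aggregator, rather than once per client (local DP) or once at the center (central DP).

```python
import numpy as np

rng = np.random.default_rng(0)

def hdp_aggregate(client_updates, groups, noise_sigma, clip_norm):
    """Hierarchical-DP-style aggregation (illustrative sketch).

    Each intermediary aggregator clips and averages its clients'
    updates, then adds calibrated Gaussian noise before forwarding
    to the central aggregator.
    """
    intermediate = []
    for group in groups:
        clipped = []
        for i in group:
            u = client_updates[i]
            norm = np.linalg.norm(u)
            # Standard DP-style norm clipping to bound each client's
            # contribution (clip_norm is an assumed hyperparameter).
            clipped.append(u * min(1.0, clip_norm / norm))
        avg = np.mean(clipped, axis=0)
        # Noise is added once per intermediary level, not per client
        # (local DP) or only at the center (central DP).
        noisy = avg + rng.normal(0.0, noise_sigma, size=avg.shape)
        intermediate.append(noisy)
    # The central aggregator averages the already-noised updates.
    return np.mean(intermediate, axis=0)

# Toy example: 6 clients split across 2 intermediary aggregators.
updates = [rng.normal(size=4) for _ in range(6)]
groups = [[0, 1, 2], [3, 4, 5]]
global_update = hdp_aggregate(updates, groups, noise_sigma=0.1, clip_norm=1.0)
print(global_update.shape)  # (4,)
```

Because fewer noise draws are averaged than in local DP, the global update is less distorted for a given per-draw noise scale, which is the intuition behind the improved privacy vs. utility trade-off claimed above.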
