AnoFel: Supporting Anonymity for Privacy-Preserving Federated Learning

06/12/2023
by Ghada Almashaqbeh, et al.

Federated learning enables users to collaboratively train a machine learning model over their private datasets. Secure aggregation protocols are employed to mitigate information leakage about the local datasets. This setup, however, still leaks the participation of a user in a training iteration, which can also be sensitive. Protecting user anonymity is even more challenging in dynamic environments where users may (re)join or leave the training process at any point in time. In this paper, we introduce AnoFel, the first framework to support private and anonymous dynamic participation in federated learning. AnoFel leverages several cryptographic primitives, the concept of anonymity sets, differential privacy, and a public bulletin board to support anonymous user registration, as well as unlinkable and confidential model update submission. Additionally, our system allows dynamic participation, where users can join or leave at any time, without needing any recovery protocol or interaction. To assess security, we formalize a notion of privacy and anonymity in federated learning, and formally prove that AnoFel satisfies this notion. To the best of our knowledge, our system is the first solution with provable anonymity guarantees. To assess efficiency, we provide a concrete implementation of AnoFel and conduct experiments showing its ability to support learning applications scaling to a large number of clients. For an MNIST classification task with 512 clients, the client setup takes less than 3 seconds, and a training iteration can be completed in 3.2 seconds. We also compare our system with prior work and demonstrate its practicality for contemporary learning tasks.
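The abstract names secure aggregation and differential privacy as two of the ingredients for hiding individual contributions. As a rough illustration of how these two pieces compose (a minimal sketch only, not AnoFel's actual construction, which additionally relies on anonymity sets, a public bulletin board, and further cryptographic machinery), the Python snippet below combines pairwise additive masking for secure aggregation with Gaussian noise for differential privacy. The modulus, fixed-point scale, noise level, and the way pairwise seeds are agreed on are all simplifying assumptions made for the example.

```python
# Minimal sketch (not AnoFel's protocol): pairwise additive masking for secure
# aggregation, combined with Gaussian noise for differential privacy.
# MOD, SCALE, SIGMA, and the hard-coded seeds are illustrative assumptions.
import numpy as np

MOD = 2**32      # aggregation group size (assumption)
SCALE = 1000     # fixed-point scaling factor (assumption)
SIGMA = 0.1      # DP noise standard deviation (assumption)

def encode(update):
    """Fixed-point encode a float update vector into Z_MOD."""
    return np.mod(np.round(update * SCALE).astype(np.int64), MOD)

def decode(agg, n_clients):
    """Map the aggregated group element back to a float average."""
    centered = np.where(agg > MOD // 2, agg - MOD, agg)
    return centered / SCALE / n_clients

def mask_update(update, my_id, peer_ids, seeds):
    """Add DP noise, then apply pairwise masks derived from shared seeds."""
    noisy = update + np.random.normal(0, SIGMA, size=update.shape)
    masked = encode(noisy)
    for peer, seed in zip(peer_ids, seeds):
        rng = np.random.default_rng(seed)
        mask = rng.integers(0, MOD, size=update.shape, dtype=np.int64)
        # one client adds the mask, the other subtracts it, so the masks
        # cancel when every client's submission is summed
        masked = np.mod(masked + mask if my_id < peer else masked - mask, MOD)
    return masked

# Toy run with 3 clients; pairwise seeds would normally come from key exchange.
dim, ids = 4, [0, 1, 2]
seeds = {frozenset({i, j}): 42 + i + j for i in ids for j in ids if i < j}
true_updates = [np.random.randn(dim) for _ in ids]
submissions = [
    mask_update(true_updates[i], i,
                [j for j in ids if j != i],
                [seeds[frozenset({i, j})] for j in ids if j != i])
    for i in ids
]
aggregate = np.mod(np.sum(submissions, axis=0), MOD)
print("plain avg :", np.mean(true_updates, axis=0))
print("secure avg:", decode(aggregate, len(ids)))
```

Summing the three masked submissions makes the pairwise masks cancel, so only the noisy average is recovered; no single submission reveals an individual client's update.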


