FheFL: Fully Homomorphic Encryption Friendly Privacy-Preserving Federated Learning with Byzantine Users

The federated learning (FL) technique was developed to mitigate data privacy issues in the traditional machine learning paradigm. While FL ensures that a user's data always remain with the user, gradients must still be shared with a centralized server to build the global model. This creates a privacy leak: the server can infer private information from the shared gradients. To mitigate this flaw, next-generation FL architectures employ encryption and anonymization techniques to protect the model updates from the server. This approach, however, creates a new challenge: malicious users can share false gradients, and because the gradients are encrypted, the server cannot identify the rogue users. To mitigate both threats, this paper proposes a novel FL algorithm based on a fully homomorphic encryption (FHE) scheme. We develop a distributed multi-key additive homomorphic encryption scheme that supports model aggregation in FL. We also develop a novel aggregation scheme that operates within the encrypted domain and exploits users' non-poisoning rates, thereby addressing data poisoning attacks while privacy is preserved by the proposed encryption scheme. Rigorous security, privacy, convergence, and experimental analyses show that FheFL is novel, secure, and private, and achieves comparable accuracy at a reasonable computational cost.
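The core mechanism the abstract relies on, letting the server sum encrypted model updates without ever seeing an individual user's gradients, can be illustrated with a minimal sketch. The sketch below uses textbook single-key Paillier encryption rather than the paper's distributed multi-key scheme, and it omits the non-poisoning-rate weighting; the toy primes, the fixed-point scale SCALE, the example gradient values, and the helper names (encrypt, decrypt, encode, decode_sum) are illustrative assumptions, not part of FheFL.

```python
# Minimal sketch of additively homomorphic aggregation for FL, using textbook
# single-key Paillier encryption in pure Python. This is NOT FheFL's distributed
# multi-key scheme; it only demonstrates the additive-homomorphic property that
# lets a server aggregate ciphertexts without decrypting individual updates.
import math
import random

# --- toy Paillier key generation (primes far too small for real security) ---
p, q = 1299709, 15485863          # small known primes, illustrative only
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)      # Carmichael's lambda(n) for n = p*q
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)   # mu = L(g^lambda mod n^2)^{-1} mod n

def encrypt(m: int) -> int:
    """Paillier encryption: c = g^m * r^n mod n^2 (additively homomorphic)."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m % n, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Paillier decryption: m = L(c^lambda mod n^2) * mu mod n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

SCALE = 10_000                    # fixed-point scale for float gradients (assumption)

def encode(x: float) -> int:
    return round(x * SCALE) % n   # negative values wrap around modulo n

def decode_sum(s: int) -> float:
    if s > n // 2:                # map the aggregate back to a signed value
        s -= n
    return s / SCALE

# --- three users each encrypt one gradient coordinate for this round ---
user_grads = [0.125, -0.310, 0.042]               # illustrative local gradients
ciphertexts = [encrypt(encode(gv)) for gv in user_grads]

# --- the server aggregates ciphertexts without ever decrypting them ---
agg = 1
for c in ciphertexts:
    agg = (agg * c) % n2          # ciphertext product encrypts the plaintext sum

# --- only the secret-key holder recovers the aggregate, never individual updates ---
print(decode_sum(decrypt(agg)))   # ~ -0.143
print(sum(user_grads))            # plaintext check
```

Because FedAvg-style aggregation only needs to sum user updates, additive homomorphism is sufficient for this step, which is why the abstract emphasizes an additive construction; FheFL realizes it with a distributed multi-key scheme rather than the single key assumed above.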
