Distributed Additive Encryption and Quantization for Privacy Preserving Federated Deep Learning

11/25/2020
by Hangyu Zhu, et al.

Homomorphic encryption is a widely used gradient protection technique in privacy-preserving federated learning. However, existing encrypted federated learning systems require a trusted third party to generate and distribute key pairs to the connected participants, making them ill-suited for federated learning scenarios and vulnerable to security risks. Moreover, encrypting all model parameters is computationally intensive, especially for large machine learning models such as deep neural networks. To mitigate these issues, we develop a practical, computationally efficient encryption-based protocol for federated deep learning in which the key pairs are collaboratively generated without the help of a third party. By quantizing the model parameters on the clients and performing approximate aggregation on the server, the proposed method avoids encrypting and decrypting the entire model. In addition, a threshold-based secret sharing technique is designed so that no single party holds the global private key for decryption, while the aggregated ciphertext can still be decrypted by a threshold number of clients even when some clients are offline. Our experimental results confirm that the proposed method significantly reduces communication cost and computational complexity compared to existing encrypted federated learning approaches, without compromising performance or security.
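As an illustration of the aggregation idea described in the abstract, the sketch below shows how fixed-point quantization lets a server combine client updates through plain integer addition, the operation that an additively homomorphic scheme supports directly on ciphertexts. This is a minimal toy example, not the authors' implementation: the bit width, clipping range, and helper names are assumptions, the integers are left unencrypted for readability, and the collaborative key generation and threshold secret sharing steps are omitted.

# Minimal sketch (assumed parameters, no real encryption) of quantized
# additive aggregation for federated learning.
import numpy as np

CLIP = 1.0        # assumed clipping range for parameter values
BITS = 16         # assumed quantization bit width
LEVELS = 2 ** BITS - 1


def quantize(weights: np.ndarray) -> np.ndarray:
    """Map float parameters in [-CLIP, CLIP] to unsigned integers."""
    clipped = np.clip(weights, -CLIP, CLIP)
    return np.round((clipped + CLIP) / (2 * CLIP) * LEVELS).astype(np.int64)


def dequantize(q_sum: np.ndarray, num_clients: int) -> np.ndarray:
    """Recover the approximate average of the clients' float parameters."""
    return (q_sum / num_clients) / LEVELS * (2 * CLIP) - CLIP


# Toy round with three clients.
rng = np.random.default_rng(0)
client_models = [rng.uniform(-0.5, 0.5, size=8) for _ in range(3)]

# Each client quantizes its parameters; in the real protocol these integers
# would then be encrypted under the jointly generated public key.
quantized_updates = [quantize(w) for w in client_models]

# The server only needs the sum of the (encrypted) integers, which an
# additively homomorphic scheme provides without decrypting any single update.
aggregate = sum(quantized_updates)

# After threshold decryption of the aggregate, dequantization yields the
# approximate average of the client models.
approx_avg = dequantize(aggregate, num_clients=len(client_models))
exact_avg = np.mean(client_models, axis=0)
print(np.max(np.abs(approx_avg - exact_avg)))   # small quantization error

In the actual protocol, the quantized integers would be encrypted under the collaboratively generated public key, the server would add ciphertexts, and a threshold number of clients would cooperate to decrypt only the aggregated result.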


