Communication-Computation Efficient Secure Aggregation for Federated Learning

by Beongjun Choi, et al.

Federated learning has been spotlighted as a way to train neural networks on data distributed over multiple nodes without requiring the nodes to share that data. Unfortunately, it has also been shown that data privacy cannot be fully guaranteed, as adversaries may be able to extract certain information about local data from the model parameters transmitted during federated learning. A recent solution based on the secure aggregation primitive enables privacy-preserving federated learning, but at the expense of significant extra communication and computational resources. In this paper, we propose a communication-computation efficient secure aggregation scheme that substantially reduces the communication and computational cost relative to the existing secure solution without sacrificing data privacy. The key idea behind the suggested scheme is to design the topology of the secret-sharing nodes as a sparse random graph instead of the complete graph used in the existing solution. We first obtain the necessary and sufficient condition on the graph to guarantee reliable and private federated learning in the information-theoretic sense. We then suggest using the Erdős-Rényi graph in particular and provide theoretical guarantees on the reliability and privacy of the proposed scheme. Through extensive real-world experiments, we demonstrate that our scheme, using only 20-30% of the resources required by the conventional scheme, maintains virtually the same levels of reliability and data privacy in practical federated learning systems.
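To make the key idea concrete, the following is a minimal sketch (not the paper's actual protocol) of pairwise-mask secure aggregation where masks are shared only along the edges of an Erdős-Rényi graph rather than between all node pairs. The function names (`erdos_renyi_edges`, `masked_updates`) and the plain-random masks are illustrative assumptions; a real secure aggregation protocol derives masks from pairwise key agreement and adds secret sharing to tolerate dropouts.

```python
import random

def erdos_renyi_edges(n, p, seed=0):
    """Sample an Erdős-Rényi graph G(n, p): each of the n*(n-1)/2
    possible edges is included independently with probability p."""
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

def masked_updates(updates, edges, seed=1):
    """Each edge (i, j) contributes a shared random mask that node i
    adds and node j subtracts, so the masks cancel in the aggregate.
    (Illustrative: real protocols derive masks from shared keys.)"""
    rng = random.Random(seed)
    dim = len(updates[0])
    masked = [list(u) for u in updates]
    for i, j in edges:
        mask = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
        for k in range(dim):
            masked[i][k] += mask[k]  # node i adds the pairwise mask
            masked[j][k] -= mask[k]  # node j subtracts the same mask
    return masked

# Four nodes, each holding a 2-dimensional local update.
updates = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]]
edges = erdos_renyi_edges(len(updates), p=0.5)
masked = masked_updates(updates, edges)

# The server only sees masked updates, yet their sum equals the
# sum of the raw updates because each mask cancels pairwise.
agg = [sum(m[k] for m in masked) for k in range(2)]
# agg is (up to floating-point rounding) the unmasked sum [16.0, 20.0]
```

With the complete graph, each node exchanges masks with all n-1 others; with G(n, p) and p well above the connectivity threshold, each node only exchanges masks with about p*(n-1) neighbors, which is where the communication/computation savings come from.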
