Communication-Computation Efficient Secure Aggregation for Federated Learning

12/10/2020
by Beongjun Choi, et al.

Federated learning has been spotlighted as a way to train neural networks on data distributed across multiple nodes without requiring the nodes to share their data. Unfortunately, it has also been shown that data privacy cannot be fully guaranteed, as adversaries may be able to extract certain information about local data from the model parameters transmitted during federated learning. A recent solution based on the secure aggregation primitive enables privacy-preserving federated learning, but at the expense of significant extra communication/computational resources. In this paper, we propose a communication-computation efficient secure aggregation scheme which substantially reduces the amount of communication/computational resources relative to the existing secure solution without sacrificing data privacy. The key idea behind the suggested scheme is to design the topology of the secret-sharing nodes as a sparse random graph instead of the complete graph used in the existing solution. We first obtain a necessary and sufficient condition on the graph to guarantee reliable and private federated learning in the information-theoretic sense. We then suggest using the Erdős-Rényi graph in particular and provide theoretical guarantees on the reliability/privacy of the proposed scheme. Through extensive real-world experiments, we demonstrate that our scheme, using only 20 ∼ 30% of the resources required by the conventional scheme, maintains virtually the same levels of reliability and data privacy in practical federated learning systems.
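As a rough illustration of the sparse-topology idea (not the paper's actual protocol), the sketch below shows pairwise additive masking for secure aggregation restricted to the edges of an Erdős-Rényi graph rather than the complete graph: each connected pair of nodes cancels a shared random mask in the aggregate, so the server learns only the sum. It omits the secret sharing of mask seeds needed to tolerate dropped nodes, and the function names (`erdos_renyi_graph`, `mask_updates`) are hypothetical, introduced only for this example.

```python
import numpy as np

def erdos_renyi_graph(n, p, seed=0):
    """Sample an undirected Erdos-Renyi graph G(n, p) as a set of edges (i, j), i < j."""
    rng = np.random.default_rng(seed)
    return {(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < p}

def mask_updates(local_updates, edges, dim, seed=1):
    """Pairwise additive masking: for each edge (i, j), node i adds a shared
    random mask and node j subtracts it, so all masks cancel in the sum."""
    rng = np.random.default_rng(seed)  # stand-in for per-pair keys derived via key agreement
    masked = [u.astype(float) for u in local_updates]
    for (i, j) in edges:
        mask = rng.normal(size=dim)
        masked[i] += mask
        masked[j] -= mask
    return masked

# Toy run: 10 nodes, sparse topology with p = 0.3 instead of the complete graph.
n, dim, p = 10, 5, 0.3
updates = [np.random.randn(dim) for _ in range(n)]
edges = erdos_renyi_graph(n, p)
masked = mask_updates(updates, edges, dim)

# The server sees only masked vectors, yet their sum equals the true aggregate.
assert np.allclose(sum(masked), sum(updates))
```

With the complete graph, each of the n nodes handles n-1 pairwise masks (and secret shares); sparsifying the topology to an Erdős-Rényi graph cuts that per-node cost roughly to p(n-1), which is the source of the 20 ∼ 30% resource figure reported in the abstract.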

