Communication-Efficient Cluster Federated Learning in Large-scale Peer-to-Peer Networks

04/08/2022
by Yilin Zhou, et al.

Traditional federated learning (FL) allows clients to collaboratively train a global model under the coordination of a central server, which has sparked great interest in exploiting the private data distributed across clients. However, the central server is a single point of failure: if it is compromised or goes down, the whole system crashes. In addition, FL usually involves a large number of clients, which incurs high communication costs. These challenges motivate a communication-efficient design for decentralized FL. In this paper, we propose CFL, an efficient and privacy-preserving global model training protocol for FL in large-scale peer-to-peer networks. CFL aggregates local contributions hierarchically through a cluster-based aggregation mode and leverages an authenticated encryption scheme to secure communication, with keys distributed by a modified secure communication key establishment protocol. Theoretical analyses show that CFL guarantees the privacy of local model update parameters, as well as integrity and authenticity, under the widespread internal semi-honest and external malicious threat models. In particular, the proposed public-voting-based key revocation effectively defends against external adversaries that hijack honest participants, preserving the confidentiality of the communication keys. Moreover, the modified secure communication key establishment protocol achieves a high network connectivity probability, ensuring the transmission security of the system.
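As a rough illustration of the idea, the sketch below shows what one cluster-based, authenticated aggregation round could look like in Python. It assumes AES-GCM as the authenticated encryption scheme and plain (weighted) averaging at the cluster and global levels; the function names, the `cryptography` AESGCM primitive, and the weighting choice are illustrative assumptions, not the paper's actual CFL construction, and key establishment and revocation are omitted.

```python
# Illustrative sketch (not the paper's exact protocol): clients encrypt their
# local model updates with an authenticated-encryption scheme (AES-GCM here),
# a cluster head decrypts and averages them, and cluster averages are then
# combined into a global update. Key distribution/revocation is out of scope.
import os
import numpy as np
from cryptography.hazmat.primitives.ciphers.aead import AESGCM


def encrypt_update(key: bytes, update: np.ndarray, client_id: bytes) -> tuple[bytes, bytes]:
    """Client side: authenticated encryption of a flattened model update."""
    nonce = os.urandom(12)  # unique nonce per message
    ciphertext = AESGCM(key).encrypt(nonce, update.astype(np.float64).tobytes(), client_id)
    return nonce, ciphertext


def decrypt_update(key: bytes, nonce: bytes, ciphertext: bytes, client_id: bytes) -> np.ndarray:
    """Cluster head: decryption raises if the ciphertext or client_id was tampered with."""
    plaintext = AESGCM(key).decrypt(nonce, ciphertext, client_id)
    return np.frombuffer(plaintext, dtype=np.float64)


def cluster_aggregate(key: bytes, messages: list[tuple[bytes, bytes, bytes]]) -> np.ndarray:
    """Average the verified updates of one cluster."""
    updates = [decrypt_update(key, nonce, ct, cid) for (cid, nonce, ct) in messages]
    return np.mean(updates, axis=0)


def global_aggregate(cluster_means: list[np.ndarray], cluster_sizes: list[int]) -> np.ndarray:
    """Hierarchical step: combine cluster averages, weighted by cluster size."""
    weights = np.array(cluster_sizes, dtype=np.float64)
    return np.average(np.stack(cluster_means), axis=0, weights=weights)


if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=128)  # assumed shared cluster key
    dim = 4
    clients = {b"client-1": np.random.randn(dim), b"client-2": np.random.randn(dim)}
    msgs = [(cid, *encrypt_update(key, upd, cid)) for cid, upd in clients.items()]
    cluster_mean = cluster_aggregate(key, msgs)
    global_update = global_aggregate([cluster_mean], [len(clients)])
    print(global_update)
```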

