FedREP: A Byzantine-Robust, Communication-Efficient and Privacy-Preserving Framework for Federated Learning

03/09/2023
by Yi-Rui Yang, et al.

Federated learning (FL) has recently become a prominent research topic, in which Byzantine robustness, communication efficiency, and privacy preservation are three important aspects. However, the tension among these three aspects makes it hard to achieve all of them simultaneously. In view of this challenge, we theoretically analyze the conditions that a communication compression method should satisfy to be compatible with existing Byzantine-robust methods and privacy-preserving methods. Motivated by these analysis results, we propose a novel communication compression method called consensus sparsification (ConSpar). To the best of our knowledge, ConSpar is the first communication compression method designed to be compatible with both Byzantine-robust methods and privacy-preserving methods. Based on ConSpar, we further propose a novel FL framework called FedREP, which is Byzantine-robust, communication-efficient, and privacy-preserving. We theoretically prove the Byzantine robustness and the convergence of FedREP. Empirical results show that FedREP significantly outperforms communication-efficient privacy-preserving baselines. Furthermore, compared with Byzantine-robust communication-efficient baselines, FedREP achieves comparable accuracy with the extra advantage of privacy preservation.
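The abstract does not detail ConSpar's mechanism, but the general idea behind consensus-style sparsification is that all clients transmit values on the same sparse coordinate set, so that coordinate-wise robust aggregation rules (such as the median) remain well defined on the compressed updates. A minimal, hypothetical sketch of that idea (all function names are illustrative, not FedREP's actual API):

```python
# Hypothetical sketch: consensus top-k sparsification followed by
# coordinate-wise median aggregation. Names and the selection rule
# are illustrative assumptions, not the paper's exact construction.

def consensus_topk_indices(grads, k):
    """Pick k coordinates by largest aggregate magnitude across clients,
    so every client reports the SAME index set (a 'consensus' support)."""
    d = len(grads[0])
    scores = [sum(abs(g[i]) for g in grads) for i in range(d)]
    return sorted(sorted(range(d), key=lambda i: -scores[i])[:k])

def coordinate_median(values):
    """Median of one coordinate's values across clients."""
    s = sorted(values)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def robust_sparse_aggregate(grads, k):
    """Aggregate only on the shared support; because every client reports
    the same coordinates, the coordinate-wise median stays well defined."""
    idx = consensus_topk_indices(grads, k)
    return {i: coordinate_median([g[i] for g in grads]) for i in idx}

# Three honest clients and one Byzantine client sending large junk values.
grads = [
    [0.9, 0.1, -0.8, 0.0],
    [1.1, 0.0, -1.0, 0.1],
    [1.0, 0.2, -0.9, -0.1],
    [100.0, -100.0, 100.0, 100.0],  # Byzantine
]
agg = robust_sparse_aggregate(grads, k=2)
# The median suppresses the Byzantine outliers on the shared support.
```

Because every client's compressed update lives on the same coordinate set, the server can apply a robust rule per coordinate without having to align mismatched sparsity patterns, which is the compatibility issue the paper's analysis addresses.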


