DReS-FL: Dropout-Resilient Secure Federated Learning for Non-IID Clients via Secret Data Sharing

10/06/2022
by Jiawei Shao, et al.

Federated learning (FL) enables collaborative training of machine learning models without centrally collecting clients' private data. Unlike in centralized training, the local datasets across FL clients are non-independent and identically distributed (non-IID), and the data-owning clients may drop out of the training process arbitrarily. Both characteristics can significantly degrade training performance. This paper proposes a Dropout-Resilient Secure Federated Learning (DReS-FL) framework based on Lagrange coded computing (LCC) to tackle both the non-IID and dropout problems. The key idea is to use Lagrange coding to secretly share the private datasets among clients, so that each client holds an encoded version of the global dataset and its local gradient computation over this encoding is unbiased. For the server to decode the gradient correctly, the gradient function must be a polynomial over a finite field; we therefore construct polynomial integer neural networks (PINNs) to instantiate the framework. Theoretical analysis shows that DReS-FL is resilient to client dropouts and protects the privacy of local datasets. Furthermore, we experimentally demonstrate that DReS-FL consistently leads to significant performance gains over baseline methods.
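To make the Lagrange-coding idea concrete, the following is a minimal sketch of secret-sharing data over a finite field and recovering a polynomial "gradient" despite a client dropout. All parameters here (field size, shard values, numbers of clients and masks) are illustrative assumptions, not taken from the paper:

```python
import random

# Illustrative prime field F_p; the paper's actual field choice may differ.
p = 8191

def inv(a):
    return pow(a, p - 2, p)   # modular inverse via Fermat's little theorem

def lagrange_eval(xs, ys, z):
    """Evaluate the polynomial interpolating (xs, ys) at point z, mod p."""
    total = 0
    for j, (xj, yj) in enumerate(zip(xs, ys)):
        num = den = 1
        for m, xm in enumerate(xs):
            if m != j:
                num = num * (z - xm) % p
                den = den * (xj - xm) % p
        total = (total + yj * num * inv(den)) % p
    return total

# K data shards (toy "datasets" as field elements) plus T random masks for privacy.
K, T = 3, 1
shards = [123, 456, 789]
masks = [random.randrange(p) for _ in range(T)]
betas = list(range(1, K + T + 1))            # interpolation points for shards + masks
alphas = list(range(K + T + 1, K + T + 9))   # N = 8 distinct client evaluation points

# Encode: client i receives u(alpha_i), where u interpolates shards + masks at betas.
shares = [lagrange_eval(betas, shards + masks, a) for a in alphas]

# Each client evaluates a degree-2 polynomial "gradient" f on its coded share.
f = lambda x: x * x % p
results = [f(s) for s in shares]

# f(u(z)) has degree 2 * (K + T - 1) = 6, so any 7 of the 8 results suffice:
# the server can decode even when one client drops out.
surviving = [i for i in range(8) if i != 5]  # client 5 drops out
xs = [alphas[i] for i in surviving][:7]
ys = [results[i] for i in surviving][:7]
decoded = [lagrange_eval(xs, ys, b) for b in betas[:K]]
assert decoded == [f(x) for x in shards]
print(decoded)  # f applied to each original shard, recovered without client 5
```

The dropout tolerance and the privacy threshold trade off against each other: each extra random mask T raises the degree of u, which in turn raises the number of surviving clients needed to interpolate f(u(z)).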


Related research:

- 01/09/2023 · Federated Coded Matrix Inversion
- 05/26/2022 · Friends to Help: Saving Federated Learning from Client Dropout
- 05/31/2022 · Secure Federated Clustering
- 01/28/2022 · A Secure and Efficient Federated Learning Framework for NLP
- 01/26/2022 · Fast Server Learning Rate Tuning for Coded Federated Dropout
- 09/13/2023 · Tackling the Non-IID Issue in Heterogeneous Federated Learning by Gradient Harmonization
- 10/21/2021 · Guess what? You can boost Federated Learning for free
