No One Left Behind: Inclusive Federated Learning over Heterogeneous Devices

02/16/2022
by Ruixuan Liu, et al.

Federated learning (FL) is an important paradigm for training global models from decentralized data in a privacy-preserving way. Existing FL methods usually assume that the global model can be trained on any participating client. However, in real applications, client devices are usually heterogeneous and have different computing power. Although big models like BERT have achieved huge success in AI, it is difficult to apply them to heterogeneous FL with weak clients. Straightforward solutions, such as removing the weak clients or using a small model that fits all clients, lead to problems like under-representation of the dropped clients and inferior accuracy due to data loss or limited model representation ability. In this work, we propose InclusiveFL, a client-inclusive federated learning method to handle this problem. The core idea of InclusiveFL is to assign models of different sizes to clients with different computing capabilities: bigger models for powerful clients and smaller ones for weak clients. We also propose an effective method to share knowledge among multiple local models of different sizes. In this way, all clients can participate in model learning in FL, and the final model can be big and powerful enough. In addition, we propose a momentum knowledge distillation method to better transfer the knowledge of big models on powerful clients to the small models on weak clients. Extensive experiments on many real-world benchmark datasets demonstrate the effectiveness of the proposed method in learning accurate models from clients with heterogeneous devices under the FL framework.
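The core idea described in the abstract, deeper sub-models for powerful clients and shallower ones for weak clients, with shared layers aggregated across sizes, can be illustrated with a small example. Below is a minimal Python sketch assuming that each smaller local model is a depth-truncated prefix of the global Transformer encoder and that shared layers are simply averaged; the function names and the plain averaging rule are illustrative assumptions, not the paper's exact aggregation, and the momentum knowledge distillation step is not shown.

# A minimal sketch of size-aware aggregation, assuming smaller local models
# are depth-truncated prefixes of the global encoder and shared layers are
# simply averaged. Names and the plain averaging rule are illustrative
# assumptions, not the paper's exact method.
from typing import Dict, List

LayerParams = Dict[str, float]   # stand-in for one Transformer layer's weights
Model = List[LayerParams]        # a model is a stack of layers (bottom to top)

def build_local_model(global_model: Model, depth: int) -> Model:
    """A client with limited compute receives only the bottom `depth` layers."""
    return [dict(layer) for layer in global_model[:depth]]

def aggregate(global_model: Model, local_models: List[Model]) -> Model:
    """Average each layer over the clients whose local model contains it."""
    new_global: Model = []
    for layer_idx, old_layer in enumerate(global_model):
        updates = [m[layer_idx] for m in local_models if layer_idx < len(m)]
        if not updates:                      # no client holds this layer: keep it
            new_global.append(dict(old_layer))
            continue
        new_global.append({
            name: sum(u[name] for u in updates) / len(updates)
            for name in old_layer
        })
    return new_global

# Toy round: a 4-layer global model, one powerful client (4 layers) and one
# weak client (2 layers); only the powerful client updates the top two layers.
global_model = [{"w": float(i)} for i in range(4)]
local_models = [build_local_model(global_model, 4), build_local_model(global_model, 2)]
global_model = aggregate(global_model, local_models)
print(global_model)

In the full method, knowledge from the deeper layers on powerful clients is additionally transferred to the smaller models via momentum knowledge distillation; the sketch above only covers the size-aware parameter sharing.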
