Global Knowledge Distillation in Federated Learning

06/30/2021
by Wanning Pan, et al.

Knowledge distillation has recently attracted considerable attention in Federated Learning (FL) because it allows training on heterogeneous clients that differ in data size and data structure. However, data samples across devices are usually not independent and identically distributed (non-i.i.d.), which poses additional challenges to the convergence and speed of federated learning. Because FL randomly selects clients to join each training round, and each client learns only from its local non-i.i.d. data, the learning process becomes even slower. An intuitive way to address this problem is to use the global model to guide local training. In this paper, we propose a novel global knowledge distillation method, named FedGKD, which learns from past global models to mitigate the bias introduced by local training. By distilling global knowledge while staying consistent with the current local models, FedGKD learns a global knowledge model in FL. To demonstrate the effectiveness of the proposed method, we conduct extensive experiments on various CV datasets (CIFAR-10/100) under non-i.i.d. settings. The evaluation results show that FedGKD outperforms previous state-of-the-art methods.
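The abstract describes distilling knowledge from past global models into the client's local training. The exact FedGKD objective is not given on this page, so the sketch below is an illustrative assumption: a local loss that adds, to the usual cross-entropy, a KL-divergence term pulling the client's predictions toward the averaged (temperature-softened) predictions of buffered historical global models. The names `fedgkd_local_loss`, `gamma`, and `temperature` are hypothetical, not the paper's notation.

```python
import numpy as np

def softmax(z, t=1.0):
    """Temperature-scaled softmax, numerically stabilized."""
    z = np.asarray(z, dtype=float) / t
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def fedgkd_local_loss(local_logits, labels, past_global_logits,
                      gamma=0.2, temperature=2.0):
    """Sketch of a FedGKD-style local objective (assumed form):
    cross-entropy on local labels + gamma * KL(teacher || student),
    where the teacher is the average prediction of past global models."""
    n = local_logits.shape[0]
    # standard cross-entropy on the client's own labels
    p = softmax(local_logits)
    ce = -np.mean(np.log(p[np.arange(n), labels] + 1e-12))
    # teacher: average softened prediction of buffered historical global models
    teacher = np.mean([softmax(g, temperature) for g in past_global_logits], axis=0)
    student = softmax(local_logits, temperature)
    # KL divergence from student to the global-knowledge teacher
    kd = np.mean(np.sum(teacher * (np.log(teacher + 1e-12)
                                   - np.log(student + 1e-12)), axis=-1))
    return ce + gamma * kd
```

When the buffered global models agree with the current local model, the distillation term vanishes and the loss reduces to plain cross-entropy; as the local model drifts on non-i.i.d. data, the KL term grows and pulls it back toward the global consensus.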


Related research

01/21/2023
The Best of Both Worlds: Accurate Global and Personalized Models through Federated Learning with Data-Free Hyper-Knowledge Distillation
Heterogeneity of data distributed across clients limits the performance ...

03/17/2022
Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning
Federated Learning (FL) is an emerging distributed learning paradigm und...

02/16/2022
No One Left Behind: Inclusive Federated Learning over Heterogeneous Devices
Federated learning (FL) is an important paradigm for training global mod...

09/16/2023
UNIDEAL: Curriculum Knowledge Distillation Federated Learning
Federated Learning (FL) has emerged as a promising approach to enable co...

09/29/2022
Label driven Knowledge Distillation for Federated Learning with non-IID Data
In real-world applications, Federated Learning (FL) meets two challenges...

06/26/2023
Federated Learning on Non-iid Data via Local and Global Distillation
Most existing federated learning algorithms are based on the vanilla Fed...

12/02/2021
FedRAD: Federated Robust Adaptive Distillation
The robustness of federated learning (FL) is vital for the distributed t...
