Label-Driven Knowledge Distillation for Federated Learning with Non-IID Data

09/29/2022
by   Minh-Duong Nguyen, et al.

In real-world applications, Federated Learning (FL) faces two challenges: (1) scalability, especially when applied to massive IoT networks; and (2) robustness in environments with heterogeneous data. To address the first challenge, we design a novel FL framework named Full-stack FL (F2L). Specifically, F2L adopts a hierarchical network architecture, which allows the FL network to be extended without reconstructing the whole system. Moreover, leveraging the advantages of this hierarchical design, we propose a new label-driven knowledge distillation (LKD) technique at the global server to address the second challenge. In contrast to existing knowledge distillation techniques, LKD can train a student model that integrates the good knowledge from all teacher models. Our proposed algorithm can therefore effectively extract the knowledge of each region's data distribution (i.e., the regional aggregated models) to reduce the divergence between clients' models when the FL system operates on non-independent and identically distributed (non-IID) data. Extensive experimental results reveal that: (i) our F2L method significantly improves overall FL efficiency across all global distillations, and (ii) F2L converges rapidly as global distillation stages occur, rather than improving incrementally over each communication cycle.
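The abstract describes combining knowledge from multiple regional teacher models into one distillation target on a per-label basis. The paper's exact LKD formulation is not given here, so the sketch below is only an illustrative approximation: it weights each teacher's softened predictions class-by-class using a hypothetical per-label reliability score (e.g., per-class validation accuracy), then renormalizes to obtain a student target. The function name `label_driven_targets` and the score matrix are assumptions, not the authors' API.

```python
import numpy as np

def softmax(z, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / temperature
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def label_driven_targets(teacher_logits, teacher_label_scores, temperature=2.0):
    """Mix multi-teacher soft predictions into one distillation target,
    weighting each teacher per class by its (assumed) per-label reliability.

    teacher_logits:       (n_teachers, n_samples, n_classes) raw logits
    teacher_label_scores: (n_teachers, n_classes), e.g. per-class accuracy
    returns:              (n_samples, n_classes) target distribution
    """
    probs = softmax(np.asarray(teacher_logits), temperature)      # (T, N, C)
    scores = np.asarray(teacher_label_scores, dtype=float)        # (T, C)
    weights = scores / scores.sum(axis=0, keepdims=True)          # normalize over teachers
    mixed = (probs * weights[:, None, :]).sum(axis=0)             # (N, C)
    return mixed / mixed.sum(axis=-1, keepdims=True)              # renormalize rows

# Example: two regional teachers, one sample, three classes.
logits = [[[2.0, 0.0, 0.0]],   # teacher 0 prefers class 0
          [[0.0, 2.0, 0.0]]]   # teacher 1 prefers class 1
scores = [[0.9, 0.3, 0.5],     # teacher 0 is (assumed) reliable on class 0
          [0.3, 0.9, 0.5]]     # teacher 1 is (assumed) reliable on class 1
targets = label_driven_targets(logits, scores)
```

The student would then be trained against `targets` with a standard KL-divergence distillation loss; teachers that are unreliable on a given label contribute correspondingly less probability mass for that label.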


