The Best of Both Worlds: Accurate Global and Personalized Models through Federated Learning with Data-Free Hyper-Knowledge Distillation

01/21/2023
by Huancheng Chen, et al.

Heterogeneity of data distributed across clients limits the performance of global models trained through federated learning, especially in settings where the class distributions of local datasets are highly imbalanced. In recent years, personalized federated learning (pFL) has emerged as a potential solution to the challenges presented by heterogeneous data. However, existing pFL methods typically enhance the performance of local models at the expense of the global model's accuracy. We propose FedHKD (Federated Hyper-Knowledge Distillation), a novel FL algorithm in which clients rely on knowledge distillation (KD) to train local models. In particular, each client extracts and sends to the server the means of local data representations and the corresponding soft predictions – information that we refer to as “hyper-knowledge”. The server aggregates this information and broadcasts it to the clients in support of local training. Notably, unlike other KD-based pFL methods, FedHKD requires neither a public dataset nor a generative model deployed at the server. We analyze the convergence of FedHKD and conduct extensive experiments on visual datasets in a variety of scenarios, demonstrating that FedHKD significantly improves both personalized and global model performance compared to state-of-the-art FL methods designed for heterogeneous data settings.
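The abstract does not spell out implementation details, but the protocol it describes (clients compute per-class means of representations and soft predictions; the server aggregates and broadcasts them) can be illustrated with a minimal PyTorch-style sketch. The names `model.features` and `model.classifier`, the softmax `temperature`, and the count-weighted aggregation below are assumptions made for illustration, not the paper's exact procedure, which may include additional steps not shown here.

```python
import torch
import torch.nn.functional as F

def extract_hyper_knowledge(model, loader, num_classes, temperature=2.0):
    """Client side: per-class means of data representations and of the
    corresponding softened predictions ('hyper-knowledge').

    Assumes a hypothetical model split into `model.features` (extractor)
    and `model.classifier` (head); these names are illustrative only.
    """
    model.eval()
    rep_sums, soft_sums = None, None
    counts = torch.zeros(num_classes)
    with torch.no_grad():
        for x, y in loader:
            h = model.features(x)                                    # representations, shape (B, d)
            q = F.softmax(model.classifier(h) / temperature, dim=1)  # soft predictions, shape (B, C)
            if rep_sums is None:
                rep_sums = torch.zeros(num_classes, h.shape[1])
                soft_sums = torch.zeros(num_classes, num_classes)
            for c in range(num_classes):
                mask = y == c
                rep_sums[c] += h[mask].sum(dim=0)
                soft_sums[c] += q[mask].sum(dim=0)
                counts[c] += mask.sum()
    safe = counts.clamp(min=1)  # classes absent from the local dataset contribute zeros
    return rep_sums / safe[:, None], soft_sums / safe[:, None], counts

def aggregate_hyper_knowledge(client_stats):
    """Server side: combine clients' hyper-knowledge into global per-class
    statistics, weighting each client by its per-class sample counts
    (one plausible aggregation rule, assumed here for illustration)."""
    total = sum(counts for _, _, counts in client_stats).clamp(min=1)
    global_reps = sum(r * c[:, None] for r, _, c in client_stats) / total[:, None]
    global_soft = sum(s * c[:, None] for _, s, c in client_stats) / total[:, None]
    return global_reps, global_soft
```

Per the abstract, the broadcast hyper-knowledge would then serve as distillation targets in each client's local training objective; only these class-level statistics, not raw data or a generative model, cross the network.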


