Data-Free Knowledge Distillation for Heterogeneous Federated Learning

05/20/2021
by Zhuangdi Zhu, et al.

Federated Learning (FL) is a decentralized machine learning paradigm in which a global server iteratively averages the model parameters of local users without accessing their data. User heterogeneity poses significant challenges to FL: it can produce drifted global models that are slow to converge. Knowledge Distillation has recently emerged to tackle this issue by refining the server model with aggregated knowledge from heterogeneous users, rather than directly averaging their model parameters. This approach, however, depends on a proxy dataset, making it impractical unless such a dataset is available. Moreover, the ensemble knowledge is not fully utilized to guide local model learning, which may in turn affect the quality of the aggregated model. Inspired by prior art, we propose a data-free knowledge distillation approach to heterogeneous FL, in which the server learns a lightweight generator to ensemble user information in a data-free manner. The generator is then broadcast to users, whose local training is regularized by the learned knowledge as an inductive bias. Empirical studies, supported by theoretical implications, show that our approach achieves better generalization performance with fewer communication rounds than the state-of-the-art.
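
The PyTorch sketch below illustrates the two halves of this data-free loop: the server fits a lightweight conditional generator against the ensemble of user classifier heads, and each user treats the broadcast generator as a regularizer on local training. It is a minimal sketch, not the paper's implementation; the names, dimensions, and hyperparameters (Generator, train_generator, local_regularizer, FEATURE_DIM, and so on) are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES, LATENT_DIM, FEATURE_DIM = 10, 32, 64

class Generator(nn.Module):
    """Lightweight conditional generator: (noise, label) -> latent feature."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + NUM_CLASSES, 128),
            nn.ReLU(),
            nn.Linear(128, FEATURE_DIM),
        )

    def forward(self, y):
        z = torch.randn(y.size(0), LATENT_DIM)
        y_onehot = F.one_hot(y, NUM_CLASSES).float()
        return self.net(torch.cat([z, y_onehot], dim=1))

def train_generator(generator, user_heads, steps=100):
    # Server side: no raw user data is touched. The generator is fit so that
    # its synthetic features are classified as their target labels by the
    # averaged ensemble of the users' uploaded classifier heads.
    for head in user_heads:
        head.requires_grad_(False)
    opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
    for _ in range(steps):
        y = torch.randint(0, NUM_CLASSES, (64,))
        feats = generator(y)
        ensemble_logits = torch.stack([h(feats) for h in user_heads]).mean(0)
        loss = F.cross_entropy(ensemble_logits, y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return generator

def local_regularizer(generator, local_head, batch_size=64):
    # User side: the broadcast generator acts as an inductive bias. Its
    # synthetic features should be classified consistently with the
    # ensemble's knowledge; this term is added to the usual local loss.
    y = torch.randint(0, NUM_CLASSES, (batch_size,))
    with torch.no_grad():
        feats = generator(y)
    return F.cross_entropy(local_head(feats), y)

if __name__ == "__main__":
    user_heads = [nn.Linear(FEATURE_DIM, NUM_CLASSES) for _ in range(5)]
    gen = train_generator(Generator(), user_heads)
    reg = local_regularizer(gen, nn.Linear(FEATURE_DIM, NUM_CLASSES))
    print(f"inductive-bias loss on one user: {reg.item():.4f}")

In a full communication round, each user would minimize its supervised loss plus this regularizer before uploading parameters for the next server aggregation, which is how the generated knowledge regulates local training as described in the abstract.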

Related research:

03/17/2022 · Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning
Federated Learning (FL) is an emerging distributed learning paradigm und...

11/14/2022 · FedCL: Federated Multi-Phase Curriculum Learning to Synchronously Correlate User Heterogeneity
Federated Learning (FL) is a new decentralized learning used for trainin...

04/14/2022 · Exploring the Distributed Knowledge Congruence in Proxy-data-free Federated Distillation
Federated learning (FL) is a distributed machine learning paradigm in wh...

06/21/2023 · An Efficient Virtual Data Generation Method for Reducing Communication in Federated Learning
Communication overhead is one of the major challenges in Federated Learn...

06/17/2022 · MetaFed: Federated Learning among Federations with Cyclic Knowledge Distillation for Personalized Healthcare
Federated learning has attracted increasing attention to building models...

03/10/2023 · Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning
In this paper, to deal with the heterogeneity in federated learning (FL)...

01/10/2022 · FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks
Applying knowledge distillation to personalized cross-silo federated lea...
