FedDKD: Federated Learning with Decentralized Knowledge Distillation

05/02/2022
by Xinjia Li, et al.

The performance of federated learning on neural networks is strongly affected by the heterogeneity of the data distribution across clients. Taking a weighted average of the local models, as most existing federated learning algorithms do, does not guarantee that the resulting global model is consistent with the local models in the space of neural network maps. In this paper, we propose FedDKD, a federated learning framework equipped with a decentralized knowledge distillation process, i.e., one that requires no data on the server. FedDKD introduces a decentralized knowledge distillation (DKD) module that distills the knowledge of the local models into the global model by approximating the average of the local models as maps, measured by the divergence defined in the loss function, rather than merely averaging their parameters as is done in the literature. Numerical experiments on various heterogeneous datasets show that FedDKD outperforms state-of-the-art methods with more efficient communication and training, requiring only a few DKD steps, especially on some extremely heterogeneous datasets.
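
To make the map-averaging idea concrete, the following PyTorch sketch shows what one FedDKD-style communication round could look like. It is a minimal illustration under assumptions, not the authors' published implementation: the function names (dkd_step, feddkd_round), the KL-divergence distillation loss with a temperature, and the specific schedule (a parameter average as initialization, followed by client-side distillation of each local teacher into the global student on that client's own data) are illustrative choices consistent with the abstract's description of DKD without server-side data.

```python
import copy
import torch
import torch.nn.functional as F

def dkd_step(global_model, local_model, loader, lr=1e-2, temperature=2.0):
    # Client-side distillation: the global model (student) is trained to
    # match the outputs of the client's local model (teacher) on the
    # client's own data, so no data ever leaves the client.
    student = copy.deepcopy(global_model)
    optimizer = torch.optim.SGD(student.parameters(), lr=lr)
    student.train()
    local_model.eval()
    for x, _ in loader:  # labels are not needed for distillation
        with torch.no_grad():
            teacher_logits = local_model(x)
        student_logits = student(x)
        # KL divergence between softened output distributions; this plays
        # the role of the divergence defined in the loss function.
        loss = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return student.state_dict()

def feddkd_round(global_model, local_models, loaders, weights, dkd_steps=2):
    # weights: client weights (e.g., proportional to local dataset sizes).
    def weighted_average(state_dicts):
        return {
            k: sum(w * sd[k].float() for w, sd in zip(weights, state_dicts))
            for k in state_dicts[0]
        }

    # 1) FedAvg-style weighted parameter average as the initialization.
    global_model.load_state_dict(
        weighted_average([m.state_dict() for m in local_models])
    )
    # 2) DKD: each client distills its local model into the global model
    #    on its own data; the server averages the returned students.
    for _ in range(dkd_steps):
        students = [
            dkd_step(global_model, m, dl)
            for m, dl in zip(local_models, loaders)
        ]
        global_model.load_state_dict(weighted_average(students))
    return global_model
```

The key difference from plain parameter averaging is step 2: the global model is pulled toward the local models as functions, by matching their outputs under a divergence, rather than only toward the average of their parameters.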

Related research

- FedX: Unsupervised Federated Learning with Cross Knowledge Distillation (07/19/2022). This paper presents FedX, an unsupervised federated learning framework. ...

- MetaFed: Federated Learning among Federations with Cyclic Knowledge Distillation for Personalized Healthcare (06/17/2022). Federated learning has attracted increasing attention to building models...

- Homogenizing Non-IID datasets via In-Distribution Knowledge Distillation for Decentralized Learning (04/09/2023). Decentralized learning enables serverless training of deep neural networ...

- FedMD: Heterogenous Federated Learning via Model Distillation (10/08/2019). Federated learning enables the creation of a powerful centralized model ...

- FedACK: Federated Adversarial Contrastive Knowledge Distillation for Cross-Lingual and Cross-Model Social Bot Detection (03/10/2023). Social bot detection is of paramount importance to the resilience and se...

- Towards Model Agnostic Federated Learning Using Knowledge Distillation (10/28/2021). An often unquestioned assumption underlying most current federated learn...

- A Decentralized Collaborative Learning Framework Across Heterogeneous Devices for Personalized Predictive Analytics (05/27/2022). In this paper, we propose a Similarity-based Decentralized Knowledge Dis...
