Exploring the Distributed Knowledge Congruence in Proxy-data-free Federated Distillation

04/14/2022
by Zhiyuan Wu, et al.

Federated learning (FL) is a distributed machine learning paradigm in which the server periodically aggregates local model parameters from clients without assembling their private data. Constrained communication and personalization requirements pose severe challenges to FL. Federated distillation (FD) has been proposed to address both problems simultaneously: it exchanges knowledge between the server and clients, supporting heterogeneous local models while significantly reducing communication overhead. However, most existing FD methods require a proxy dataset, which is often unavailable in reality. A few recent proxy-data-free FD approaches eliminate the need for additional public data, but suffer from substantial discrepancies among local knowledge due to model heterogeneity, leading to ambiguous representations on the server and inevitable accuracy degradation. To tackle this issue, we propose a proxy-data-free FD algorithm based on distributed knowledge congruence (FedDKC). FedDKC leverages well-designed refinement strategies to narrow local knowledge differences to an acceptable upper bound, thereby mitigating the negative effects of knowledge incongruence. Specifically, from the perspectives of peak probability and Shannon entropy of local knowledge, we design kernel-based knowledge refinement (KKR) and searching-based knowledge refinement (SKR) respectively, and theoretically guarantee that the refined local knowledge satisfies an approximately similar distribution and can be regarded as congruent. Extensive experiments conducted on three common datasets demonstrate that our proposed FedDKC significantly outperforms the state-of-the-art on various heterogeneous settings (accuracy boosts in 93.33% of comparisons, Top-1 accuracy boosts of up to 4.38%, and Top-5 accuracy boosts of up to 10.31%), while also evidently improving convergence speed.
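The refinement step is the technical core of the abstract, so a compact illustration may help. Below is a minimal NumPy sketch of how the two strategies described above could be realized. The function names, the shared targets `p_target` and `h_target`, the power-kernel form in `kkr`, and the temperature binary search in `skr` are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def kkr(probs, p_target=0.9, eps=1e-12):
    """Kernel-based knowledge refinement (sketch, assumed power kernel).

    Rescales a client's probability vector so its peak probability is
    pushed toward a shared target before renormalization, bounding how
    "peaked" knowledge from heterogeneous local models can be.
    """
    probs = np.clip(probs, eps, 1.0 - eps)
    # Choose exponent k so that max(probs) ** k == p_target (pre-normalization).
    k = np.log(p_target) / np.log(probs.max())
    refined = probs ** k
    return refined / refined.sum()

def shannon_entropy(probs):
    """H(p) = -sum(p * log p)."""
    return -np.sum(probs * np.log(probs))

def skr(probs, h_target, tol=1e-3, t_lo=0.05, t_hi=20.0, max_iter=50):
    """Searching-based knowledge refinement (sketch, assumed binary search).

    Searches for a temperature T such that the tempered distribution's
    Shannon entropy matches a shared target within tolerance, again
    narrowing the discrepancy among clients' knowledge.
    """
    logits = np.log(np.clip(probs, 1e-12, 1.0))
    refined = probs
    for _ in range(max_iter):
        t = 0.5 * (t_lo + t_hi)
        # Subtract the max logit for numerical stability before exponentiating.
        z = np.exp((logits - logits.max()) / t)
        refined = z / z.sum()
        h = shannon_entropy(refined)
        if abs(h - h_target) < tol:
            break
        # Higher temperature flattens the distribution and raises entropy.
        if h < h_target:
            t_lo = t
        else:
            t_hi = t
    return refined
```

In this reading, the refinement would be applied (e.g., on the server) to each client's uploaded knowledge before aggregation, so that every local distribution lands within the same bounded neighborhood, matching peak probability under KKR and matching entropy under SKR, regardless of the local model architecture.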


Related research:

05/20/2021 · Data-Free Knowledge Distillation for Heterogeneous Federated Learning
Federated Learning (FL) is a decentralized machine-learning paradigm, in...

04/26/2022 · One-shot Federated Learning without Server-side Training
Federated Learning (FL) has recently made significant progress as a new ...

07/25/2023 · FedMEKT: Distillation-based Embedding Knowledge Transfer for Multimodal Federated Learning
Federated learning (FL) enables a decentralized machine learning paradig...

01/01/2023 · FedICT: Federated Multi-task Distillation for Multi-access Edge Computing
The growing interest in intelligent services and privacy protection for ...

04/27/2022 · Heterogeneous Ensemble Knowledge Transfer for Training Large Models in Federated Learning
Federated learning (FL) enables edge-devices to collaboratively learn a ...

04/12/2023 · FedTrip: A Resource-Efficient Federated Learning Method with Triplet Regularization
In the federated learning scenario, geographically distributed clients c...

06/19/2023 · FSAR: Federated Skeleton-based Action Recognition with Adaptive Topology Structure and Knowledge Distillation
Existing skeleton-based action recognition methods typically follow a ce...
