MetaFed: Federated Learning among Federations with Cyclic Knowledge Distillation for Personalized Healthcare

06/17/2022
by   Yiqiang Chen, et al.

Federated learning has attracted increasing attention as a way to build models without accessing raw user data, especially in healthcare. In real applications, however, different federations can seldom work together, for reasons such as data heterogeneity and distrust of, or the absence of, a central server. In this paper, we propose a novel framework called MetaFed to facilitate trustworthy FL between different federations. MetaFed obtains a personalized model for each federation without a central server via the proposed Cyclic Knowledge Distillation. Specifically, MetaFed treats each federation as a meta distribution and aggregates the knowledge of each federation in a cyclic manner. Training is split into two phases: common knowledge accumulation and personalization. Comprehensive experiments on three benchmarks demonstrate that MetaFed, without a server, achieves better accuracy than state-of-the-art methods (e.g., a 10%+ accuracy improvement over the baseline on PAMAP2) at lower communication cost.
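
To make the cyclic procedure concrete, here is a minimal sketch of how cyclic knowledge distillation among federations could look in PyTorch. This is an illustration under stated assumptions, not the authors' released code: names such as `federations`, `cyclic_round`, and `kd_weight` are hypothetical, and each federation is assumed to hold a local model and data loader, with the model trained by one federation serving as the distillation teacher for the next.

```python
import copy

import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Standard soft-target distillation loss: KL divergence between
    # temperature-softened student and teacher output distributions.
    return F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)


def cyclic_round(federations, incoming_model, kd_weight, epochs=1, lr=1e-3):
    """One cyclic pass: each federation trains its local model on its own
    data while distilling from the model handed over by its predecessor,
    then passes the result on. No central server is involved."""
    for fed in federations:
        teacher = copy.deepcopy(incoming_model).eval()
        student = fed["model"]
        optimizer = torch.optim.SGD(student.parameters(), lr=lr)
        student.train()
        for _ in range(epochs):
            for x, y in fed["loader"]:
                optimizer.zero_grad()
                logits = student(x)
                with torch.no_grad():
                    teacher_logits = teacher(x)
                # Supervised loss on local data, plus distillation from the
                # knowledge accumulated by the preceding federations.
                loss = F.cross_entropy(logits, y) + kd_weight * distillation_loss(
                    logits, teacher_logits
                )
                loss.backward()
                optimizer.step()
        # Hand the accumulated knowledge to the next federation in the ring.
        incoming_model = student
    return incoming_model
```

In this reading, the common knowledge accumulation phase would run such rounds with a fixed distillation weight so that knowledge keeps circulating around the ring, while the personalization phase would adapt the weight per federation (e.g., reducing it when the incoming model performs poorly on local validation data), leaving each federation with its own personalized model.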


Related research

05/02/2022  FedDKD: Federated Learning with Decentralized Knowledge Distillation
The performance of federated learning in neural networks is generally in...

05/20/2021  Data-Free Knowledge Distillation for Heterogeneous Federated Learning
Federated Learning (FL) is a decentralized machine-learning paradigm, in...

02/23/2023  Personalized Decentralized Federated Learning with Knowledge Distillation
Personalization in federated learning (FL) functions as a coordinator fo...

10/29/2022  Fast-Convergent Federated Learning via Cyclic Aggregation
Federated learning (FL) aims at optimizing a shared global model over mu...

03/10/2023  Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning
In this paper, to deal with the heterogeneity in federated learning (FL)...

12/03/2020  Federated Learning with Diversified Preference for Humor Recognition
Understanding humor is critical to creative language modeling with many ...

11/04/2020  Federated Knowledge Distillation
Distributed learning frameworks often rely on exchanging model parameter...
