UNIDEAL: Curriculum Knowledge Distillation Federated Learning

09/16/2023
by Yuwen Yang, et al.

Federated Learning (FL) has emerged as a promising approach to collaborative learning among multiple clients while preserving data privacy. However, cross-domain FL tasks, in which clients hold data drawn from different domains or distributions, remain challenging due to this inherent heterogeneity. In this paper, we present UNIDEAL, a novel FL algorithm designed to tackle cross-domain scenarios and heterogeneous model architectures. The proposed method introduces Adjustable Teacher-Student Mutual Evaluation Curriculum Learning, which significantly enhances the effectiveness of knowledge distillation in FL settings. We conduct extensive experiments on various datasets, comparing UNIDEAL with state-of-the-art baselines. Our results demonstrate that UNIDEAL achieves superior performance in both model accuracy and communication efficiency. Additionally, we provide a convergence analysis of the algorithm, showing a convergence rate of O(1/T) under non-convex conditions.
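The building block UNIDEAL relies on, knowledge distillation, can be illustrated independently of the paper's specific curriculum schedule. The sketch below shows the standard temperature-scaled distillation loss (a KL divergence between the teacher's and student's softened predictions); the function names and the fixed temperature are illustrative assumptions, not the paper's actual formulation of Adjustable Teacher-Student Mutual Evaluation Curriculum Learning.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()                      # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    The T^2 factor is the usual gradient-scale correction from
    Hinton et al.'s distillation formulation. In an FL setting the
    teacher logits would come from the server-side or peer model and
    the student logits from the local client model.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))
    return float(kl * temperature ** 2)
```

For identical teacher and student logits the loss is zero, and it grows as the two predictive distributions diverge; a curriculum scheme such as UNIDEAL's would then adjust how strongly this term is weighted over training rounds.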


Related research

06/30/2021 · Global Knowledge Distillation in Federated Learning
Knowledge distillation has caught a lot of attention in Federated Learni...

09/10/2022 · Preserving Privacy in Federated Learning with Ensemble Cross-Domain Knowledge Distillation
Federated Learning (FL) is a machine learning paradigm where local nodes...

09/29/2022 · Label driven Knowledge Distillation for Federated Learning with non-IID Data
In real-world applications, Federated Learning (FL) meets two challenges...

10/28/2022 · Completely Heterogeneous Federated Learning
Federated learning (FL) faces three major difficulties: cross-domain, he...

08/21/2023 · FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning
Recently, foundation models have exhibited remarkable advancements in mu...

10/04/2022 · Domain Discrepancy Aware Distillation for Model Aggregation in Federated Learning
Knowledge distillation has recently become popular as a method of model ...

08/07/2023 · The Prospect of Enhancing Large-Scale Heterogeneous Federated Learning with Transformers
Federated learning (FL) addresses data privacy concerns by enabling coll...
