CD^2-pFed: Cyclic Distillation-guided Channel Decoupling for Model Personalization in Federated Learning

04/08/2022
by   Yiqing Shen, et al.

Federated learning (FL) is a distributed learning paradigm that enables multiple clients to collaboratively learn a shared global model. Despite recent progress, it remains challenging to deal with heterogeneous data across clients, as the discrepant data distributions usually prevent the global model from generalizing well to each participating client. In this paper, we propose CD^2-pFed, a novel Cyclic Distillation-guided Channel Decoupling framework, to personalize the global model in FL under various settings of data heterogeneity. Different from previous works, which establish layer-wise personalization to overcome the non-IID data across different clients, we make the first attempt at channel-wise assignment for model personalization, referred to as channel decoupling. To further facilitate the collaboration between private and shared weights, we propose a novel cyclic distillation scheme that imposes a consistency regularization between the local and global model representations during the federation. Guided by cyclic distillation, our channel decoupling framework delivers more accurate and better-generalized results under different kinds of heterogeneity, such as feature skew, label distribution skew, and concept shift. Comprehensive experiments on four benchmarks, spanning natural image and medical image analysis tasks, demonstrate the consistent effectiveness of our method on both local and external validations.
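The abstract does not spell out the exact formulation, but the two core ideas can be sketched in PyTorch: a layer whose output channels are split into a shared part (aggregated by the server) and a private part (kept local), plus a symmetric distillation loss between local and global predictions. All names here (`DecoupledConv`, `cyclic_distillation_loss`, the 50% split ratio, and the symmetric-KL form of the consistency term) are illustrative assumptions, not the paper's implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecoupledConv(nn.Module):
    """Convolution whose output channels are split channel-wise into a
    shared part (synchronized via federated averaging) and a private part
    (kept on the client). The split ratio is a hypothetical hyperparameter."""

    def __init__(self, in_ch: int, out_ch: int, shared_ratio: float = 0.5):
        super().__init__()
        n_shared = int(out_ch * shared_ratio)
        # Weights the server would aggregate across clients.
        self.shared = nn.Conv2d(in_ch, n_shared, kernel_size=3, padding=1)
        # Weights personalized per client, never uploaded.
        self.private = nn.Conv2d(in_ch, out_ch - n_shared, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Concatenate shared and private feature maps along the channel axis.
        return torch.cat([self.shared(x), self.private(x)], dim=1)

def cyclic_distillation_loss(local_logits: torch.Tensor,
                             global_logits: torch.Tensor,
                             temperature: float = 2.0) -> torch.Tensor:
    """Symmetric KL consistency between local (personalized) and global
    outputs, as one plausible instance of the mutual regularization the
    abstract describes."""
    log_p_local = F.log_softmax(local_logits / temperature, dim=1)
    log_p_global = F.log_softmax(global_logits / temperature, dim=1)
    kl_lg = F.kl_div(log_p_local, log_p_global.exp(), reduction="batchmean")
    kl_gl = F.kl_div(log_p_global, log_p_local.exp(), reduction="batchmean")
    # Scale by T^2, the usual correction in temperature-scaled distillation.
    return 0.5 * (kl_lg + kl_gl) * temperature ** 2
```

During local training, a client would minimize its task loss plus this consistency term, while the server averages only the `shared` parameters; the `private` channels absorb client-specific patterns.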

