Domain Discrepancy Aware Distillation for Model Aggregation in Federated Learning

10/04/2022
by Shangchao Su, et al.

Knowledge distillation has recently become popular as a method of model aggregation on the server in federated learning. It is generally assumed that abundant unlabeled public data are available on the server. In reality, however, there exists a domain discrepancy between the server-side dataset and the client-side datasets, which limits the performance of knowledge distillation. How to improve aggregation under such a domain discrepancy setting remains an open problem. In this paper, we first analyze the generalization bound of the aggregation model produced by knowledge distillation on the client domains, and then describe two challenges that the domain discrepancies pose to the aggregation model: server-to-client discrepancy and client-to-client discrepancy. Following our analysis, we propose FedD3A, an adaptive knowledge aggregation algorithm based on domain discrepancy aware distillation, to lower the bound. FedD3A performs adaptive weighting at the sample level in each round of FL: for each sample in the server domain, only the client models of similar domains are selected to play the teacher role. To achieve this, we show that the discrepancy between a server-side sample and a client domain can be approximately measured using a subspace projection matrix computed on each client without accessing its raw data. The server can thus leverage the projection matrices from multiple clients to assign a weight to each corresponding teacher model for every server-side sample. We validate FedD3A on two popular cross-domain datasets and show that it outperforms competing methods in both the cross-silo and cross-device FL settings.
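The abstract describes the core mechanism only at a high level: each client shares a subspace projection matrix instead of raw data, and the server uses projection residuals to weight teachers per server-side sample. The sketch below illustrates one plausible instantiation of that idea; the rank-k SVD construction, the residual-based softmax weighting, the temperature parameter, and all function names are assumptions made here for illustration and are not taken from the paper.

```python
# Illustrative sketch of domain-discrepancy-aware teacher weighting.
# All formulas and names below are assumptions, not the paper's exact method.
import numpy as np


def client_projection_matrix(features: np.ndarray, k: int) -> np.ndarray:
    """On a client: build a rank-k projection matrix onto the local feature subspace.

    features: (n_samples, d) feature matrix extracted by the client model.
    Returns a (d, d) matrix P = V_k V_k^T; only P (never raw data) leaves the client.
    """
    # Top-k right singular vectors span the client's feature subspace.
    _, _, vt = np.linalg.svd(features - features.mean(axis=0), full_matrices=False)
    v_k = vt[:k].T                      # (d, k)
    return v_k @ v_k.T                  # (d, d)


def teacher_weights(server_feature: np.ndarray,
                    projections: list,
                    temperature: float = 1.0) -> np.ndarray:
    """On the server: weight each client (teacher) for one server-side sample.

    A small projection residual ||f - P_c f|| means the sample lies close to
    client c's feature subspace, so that teacher receives a larger weight.
    """
    residuals = np.array([np.linalg.norm(server_feature - p @ server_feature)
                          for p in projections])
    logits = -residuals / temperature
    weights = np.exp(logits - logits.max())   # numerically stable softmax
    return weights / weights.sum()


# Usage: combine teacher logits with per-sample weights to form a soft target
# for distillation into the aggregated (student) model.
rng = np.random.default_rng(0)
d, k = 64, 8
client_feats = [rng.normal(size=(200, d)) for _ in range(3)]   # stand-in client features
projections = [client_projection_matrix(f, k) for f in client_feats]

f_server = rng.normal(size=d)                                   # one server-side feature
w = teacher_weights(f_server, projections)
teacher_logits = [rng.normal(size=10) for _ in range(3)]        # stand-in teacher outputs
soft_target = sum(wi * li for wi, li in zip(w, teacher_logits)) # weighted ensemble target
```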

