FedICT: Federated Multi-task Distillation for Multi-access Edge Computing

01/01/2023
by   Zhiyuan Wu, et al.

The growing interest in intelligent services and privacy protection for mobile devices has given rise to the widespread application of federated learning in Multi-access Edge Computing (MEC). Diverse user behaviors call for personalized services with heterogeneous Machine Learning (ML) models on different devices. Federated Multi-task Learning (FMTL) has been proposed to train related but personalized ML models for different devices, but previous works suffer from excessive communication overhead during training and neglect the model heterogeneity among devices in MEC. Introducing knowledge distillation into FMTL can simultaneously enable efficient communication and model heterogeneity among clients, yet existing methods rely on a public dataset, which is often unavailable in practice. To tackle this dilemma, Federated MultI-task Distillation for Multi-access Edge CompuTing (FedICT) is proposed. FedICT exchanges knowledge through bi-directional distillation between clients and the server while keeping local and global knowledge apart, enabling multi-task clients and alleviating the client drift that arises from divergent optimization directions of client-side local models. Specifically, FedICT consists of Federated Prior Knowledge Distillation (FPKD) and Local Knowledge Adjustment (LKA). FPKD reinforces each client's fit to its local data by introducing prior knowledge of the local data distribution, while LKA corrects the server's distillation loss so that the transferred local knowledge better matches the generalized representation. Experiments on three datasets show that FedICT significantly outperforms all compared benchmarks under various data heterogeneity and model architecture settings, achieving improved accuracy with less than 1.2% of the training communication overhead of FedAvg and no more than 75% of the training communication cost of FedGKT.
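As a rough illustration of the two components, the minimal PyTorch sketch below shows one way the described losses could look: a client-side loss that re-weights the server's soft targets by the client's local label prior (in the spirit of FPKD), and a server-side distillation loss that is corrected using the ground-truth labels (in the spirit of LKA). All names here (`fpkd_client_loss`, `lka_server_loss`, `lam`, `tau`, `label_prior`) are hypothetical assumptions for illustration, not the paper's implementation.

```python
# Minimal, hypothetical sketch of bi-directional federated distillation in the
# spirit of FedICT's FPKD and LKA components. Names (fpkd_client_loss,
# lka_server_loss, lam, tau, label_prior) are illustrative assumptions,
# not the paper's actual implementation.
import torch
import torch.nn.functional as F


def fpkd_client_loss(student_logits, server_logits, labels, label_prior,
                     lam=0.5, tau=2.0):
    """Client-side objective: cross-entropy on local labels plus distillation
    from the server's soft targets, which are re-weighted by the client's
    local class prior so local knowledge is reinforced (assumed FPKD-style)."""
    ce = F.cross_entropy(student_logits, labels)
    # Re-weight the server's soft targets by the local label prior, renormalize.
    soft = F.softmax(server_logits / tau, dim=1) * label_prior
    soft = soft / soft.sum(dim=1, keepdim=True)
    kd = F.kl_div(F.log_softmax(student_logits / tau, dim=1), soft,
                  reduction="batchmean") * tau ** 2
    return ce + lam * kd


def lka_server_loss(server_logits, client_logits, labels, lam=0.5, tau=2.0):
    """Server-side objective: distill from the uploaded client logits, but
    correct the distillation term by masking out client predictions that
    contradict the ground-truth labels (assumed LKA-style adjustment)."""
    ce = F.cross_entropy(server_logits, labels)
    keep = (client_logits.argmax(dim=1) == labels).float()
    kd = F.kl_div(F.log_softmax(server_logits / tau, dim=1),
                  F.softmax(client_logits / tau, dim=1),
                  reduction="none").sum(dim=1)
    kd = (kd * keep).sum() / keep.sum().clamp(min=1.0) * tau ** 2
    return ce + lam * kd


# Toy usage: 8 samples, 10 classes, uniform local label prior.
if __name__ == "__main__":
    student = torch.randn(8, 10)
    teacher = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    prior = torch.full((10,), 0.1)
    print(fpkd_client_loss(student, teacher, labels, prior).item())
    print(lka_server_loss(teacher, student, labels).item())
```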

