Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning

03/10/2023
by Xiucheng Wang, et al.

In this paper, to deal with heterogeneity in federated learning (FL) systems, a knowledge distillation (KD) driven training framework for FL is proposed, in which each user can select its neural network model on demand and distill knowledge from a large teacher model using its own private dataset. To overcome the challenge of training the large teacher model on resource-limited user devices, digital twin (DT) technology is exploited: the teacher model can be trained at the user's DT, which resides at a server with sufficient computing resources. Then, during model distillation, each user can update the parameters of its model at either the physical entity or the digital agent. The joint problem of model selection, training offloading, and resource allocation for users is formulated as a mixed-integer programming (MIP) problem. To solve it, Q-learning and optimization are used jointly: Q-learning selects a model for each user and decides whether to train locally or on the server, and optimization allocates resources to users based on the Q-learning output. Simulation results show that the proposed DT-assisted KD framework and joint optimization method significantly improve users' average accuracy while reducing total delay.
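Concretely, the per-user distillation step described in the abstract can be pictured as follows: the student model is trained on the user's private data with a loss that combines the usual cross-entropy term with a KL-divergence term against the teacher's temperature-softened logits. The sketch below shows one such step in PyTorch; the architectures, temperature T, and weight alpha are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distill_step(student, teacher, x, y, optimizer, T=2.0, alpha=0.5):
    """One distillation step on a private mini-batch (x, y).

    The student fits the true labels (cross-entropy) while matching the
    teacher's temperature-softened logits (KL divergence). T and alpha
    are illustrative hyperparameters, not values from the paper.
    """
    teacher.eval()
    with torch.no_grad():
        t_logits = teacher(x)      # teacher trained at the DT on the server
    s_logits = student(x)          # student model selected by the user

    kd = F.kl_div(
        F.log_softmax(s_logits / T, dim=1),
        F.softmax(t_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                    # T^2 keeps the gradient scale comparable
    ce = F.cross_entropy(s_logits, y)
    loss = alpha * kd + (1.0 - alpha) * ce

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage sketch: a small student distilling from a larger teacher.
student = nn.Sequential(nn.Flatten(), nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))
teacher = nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
opt = torch.optim.SGD(student.parameters(), lr=0.01)
x, y = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
print(distill_step(student, teacher, x, y, opt))
```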
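The decision layer can likewise be sketched as tabular Q-learning over a discretized system state, where each action pairs a candidate student model with a training location, and the reward trades accuracy against delay (the resource-allocation subproblem would supply the delay term). The state space, action encoding, and hyperparameters below are assumptions for illustration; the paper's exact MDP formulation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

N_STATES = 16              # assumed: quantized channel/compute conditions
N_MODELS = 3               # assumed: candidate student architectures
N_ACTIONS = N_MODELS * 2   # (model index) x (train locally=0 / offload to DT=1)

Q = np.zeros((N_STATES, N_ACTIONS))
lr, gamma, eps = 0.1, 0.9, 0.1

def choose_action(state: int) -> int:
    """Epsilon-greedy selection over (model, location) pairs."""
    if rng.random() < eps:
        return int(rng.integers(N_ACTIONS))
    return int(np.argmax(Q[state]))

def decode(action: int) -> tuple[int, int]:
    """Split a flat action into (model index, offload flag)."""
    return action // 2, action % 2

def q_update(s: int, a: int, reward: float, s_next: int) -> None:
    """Standard one-step Q-learning update."""
    Q[s, a] += lr * (reward + gamma * Q[s_next].max() - Q[s, a])

# Example interaction: the reward rewards accuracy and penalizes delay,
# with accuracy/delay here being placeholders for simulated feedback.
s = 0
a = choose_action(s)
model_idx, offload = decode(a)
accuracy, delay = 0.87, 0.4
q_update(s, a, reward=accuracy - 0.5 * delay, s_next=1)
```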

Related research

Asynchronous Edge Learning using Cloned Knowledge Distillation (10/20/2020)
Data-Free Knowledge Distillation for Heterogeneous Federated Learning (05/20/2021)
Label driven Knowledge Distillation for Federated Learning with non-IID Data (09/29/2022)
MetaFed: Federated Learning among Federations with Cyclic Knowledge Distillation for Personalized Healthcare (06/17/2022)
ConDistFL: Conditional Distillation for Federated Learning from Partially Annotated Data (08/08/2023)
Domain Discrepancy Aware Distillation for Model Aggregation in Federated Learning (10/04/2022)
An Efficient Federated Distillation Learning System for Multi-task Time Series Classification (12/30/2021)
