Asynchronous Edge Learning using Cloned Knowledge Distillation

10/20/2020
by Sang-ho Lee, et al.

With the increasing demand for more and more data, federated learning (FL) methods, which utilize highly distributed on-device local data in the training process, have been proposed. However, fledgling services provided by startup companies not only have a limited number of clients but also minimal resources for constant communication between the server and multiple clients. In addition, in a real-world environment where the user pool changes dynamically, the FL system must efficiently exploit the rapid inflow and outflow of users while experiencing minimal bottlenecks from the network delays of multiple users. In this respect, we amend the federated learning scenario to a more flexible asynchronous edge learning setting. To solve these learning problems, we propose an asynchronous, model-based communication method with knowledge distillation. In particular, we dub our knowledge distillation scheme "cloned distillation" and explain how it differs from other knowledge distillation methods. In brief, we found that in knowledge distillation between a teacher and a student there are two contesting traits in the student: to attend to the teacher's knowledge or to retain its own knowledge that the teacher does not have. In this edge learning scenario, the attending property should be amplified rather than the retaining one, because teachers are dispatched to users to learn from them and then recollected at the server to teach the core model. Our asynchronous edge learning method can elastically handle the dynamic inflow and outflow of users in a service with minimal communication cost, operate with essentially no bottleneck due to user delay, and protect users' privacy. We also found that it is robust to users who behave abnormally or maliciously.
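As a rough illustration of the teacher-student distillation that the abstract builds on, the sketch below shows a student attending to a frozen teacher's softened outputs alongside the usual hard-label loss, with the teacher created as a clone of a core model before being dispatched to an edge user. The function names (distillation_loss, distill_step), temperature T, weight alpha, and the cloning-by-initialization step are illustrative assumptions for this sketch, not the paper's exact "cloned distillation" recipe, which is described in the full text.

```python
# Minimal teacher-student distillation sketch (PyTorch). The cloning step,
# temperature, and loss weighting are illustrative assumptions; the paper's
# exact "cloned distillation" procedure is given in the full text.
import copy
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.7):
    """Soft-target KL term (the 'attending' signal) plus the usual hard-label loss."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard


def distill_step(student, teacher, batch, optimizer):
    """One update in which the student attends to a frozen teacher's outputs."""
    x, y = batch
    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)
    loss = distillation_loss(student_logits, teacher_logits, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# One reading of "cloned": the teacher sent to an edge user starts as a copy of
# the server-side core model, and is later recollected to teach that model.
# core_model = ...                       # server-side core model
# teacher = copy.deepcopy(core_model)    # clone dispatched to an edge user
```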

Related research

FedKD: Communication Efficient Federated Learning via Knowledge Distillation (08/30/2021)
Federated learning is widely used to learn intelligent models from decen...

Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning (03/10/2023)
In this paper, to deal with the heterogeneity in federated learning (FL)...

An Efficient Federated Distillation Learning System for Multi-task Time Series Classification (12/30/2021)
This paper proposes an efficient federated distillation learning system ...

Knowledge Distillation For Wireless Edge Learning (04/03/2021)
In this paper, we propose a framework for predicting frame errors in the...

Pool of Experts: Realtime Querying Specialized Knowledge in Massive Neural Networks (07/03/2021)
In spite of the great success of deep learning technologies, training an...

Survey of Knowledge Distillation in Federated Edge Learning (01/14/2023)
The increasing demand for intelligent services and privacy protection of...

Adapter-based Selective Knowledge Distillation for Federated Multi-domain Meeting Summarization (08/07/2023)
Meeting summarization has emerged as a promising technique for providing...
