FedKNOW: Federated Continual Learning with Signature Task Knowledge Integration at Edge

12/04/2022
by   Yaxin Luopan, et al.

Deep Neural Networks (DNNs) have been ubiquitously adopted in the Internet of Things and are becoming an integral part of our daily life. When tackling evolving learning tasks in the real world, such as classifying different types of objects, DNNs face the challenge of continually retraining themselves according to the tasks arriving at different edge devices. Federated continual learning is a promising technique that offers a partial solution but has yet to overcome the following difficulties: significant accuracy loss due to limited on-device processing, negative knowledge transfer caused by limited communication of non-IID data, and limited scalability in the number of tasks and edge devices. In this paper, we propose FedKNOW, an accurate and scalable federated continual learning framework, built on a novel concept of signature task knowledge. FedKNOW is a client-side solution that continuously extracts and integrates the knowledge of signature tasks, i.e., the past tasks that most strongly influence the current task. Each client of FedKNOW is composed of a knowledge extractor, a gradient restorer and, most importantly, a gradient integrator. Upon training for a new task, the gradient integrator prevents catastrophic forgetting and mitigates negative knowledge transfer by effectively combining signature tasks identified from the past local tasks and other clients' current tasks obtained through the global model. We implement FedKNOW in PyTorch and extensively evaluate it against state-of-the-art techniques on popular federated continual learning benchmarks. Extensive evaluation results on heterogeneous edge devices show that FedKNOW improves model accuracy by 63.24% and 34.28% in its reported settings, and that the gains persist under large numbers of tasks or clients and when training different complex networks.
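The abstract does not spell out the gradient integrator's exact update rule. As a purely illustrative sketch (not FedKNOW's actual algorithm), one common way to combine a new task's gradient with stored signature-task gradients so that past tasks are not harmed is a conflict-aware projection, in the spirit of GEM/PCGrad: whenever the new gradient points in a direction that would increase a signature task's loss (negative inner product), the conflicting component is projected out. All names below are hypothetical.

```python
import numpy as np

def integrate_gradient(g_new, signature_grads):
    """Illustrative gradient integrator (NOT FedKNOW's exact rule).

    For each stored signature-task gradient, if the new task's
    gradient conflicts with it (negative dot product, i.e. the
    update would increase that signature task's loss), remove the
    conflicting component via orthogonal projection.
    """
    g = np.asarray(g_new, dtype=float).copy()
    for g_sig in signature_grads:
        g_sig = np.asarray(g_sig, dtype=float)
        dot = g @ g_sig
        if dot < 0:  # conflict with a signature task
            g -= (dot / (g_sig @ g_sig)) * g_sig  # project out the conflict
    return g

# Toy usage: the raw gradient [1, -1] conflicts with the stored
# signature gradient [0, 1]; projection yields [1, 0].
integrated = integrate_gradient([1.0, -1.0], [[0.0, 1.0]])
```

After integration, the resulting gradient is non-conflicting with every processed signature direction, which is the property such schemes use to bound forgetting on the retained tasks.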

Related research:

- 03/06/2020, Federated Continual Learning with Adaptive Parameter Communication: There has been a surge of interest in continual learning and federated l...
- 03/24/2022, Addressing Client Drift in Federated Continual Learning with Adaptive Optimization: Federated learning has been extensively studied and is the prevalent met...
- 06/12/2020, Collaborative and continual learning for classification tasks in a society of devices: Today we live in a context in which devices are increasingly interconnec...
- 09/01/2021, Federated Reconnaissance: Efficient, Distributed, Class-Incremental Learning: We describe federated reconnaissance, a class of learning problems in wh...
- 10/12/2022, Federated Continual Learning for Text Classification via Selective Inter-client Transfer: In this work, we combine the two paradigms: Federated Learning (FL) and ...
- 07/10/2023, Fed-CPrompt: Contrastive Prompt for Rehearsal-Free Federated Continual Learning: Federated continual learning (FCL) learns incremental tasks over time fr...
- 12/16/2020, Inexact-ADMM Based Federated Meta-Learning for Fast and Continual Edge Learning: In order to meet the requirements for performance, safety, and latency i...
