Federated Continual Learning with Adaptive Parameter Communication

03/06/2020
by Jaehong Yoon et al.

There has been a surge of interest in both continual learning and federated learning, each of which is important for training deep neural networks in real-world scenarios. Yet little research has addressed the scenario where each client learns a sequence of tasks from private local data. This problem of federated continual learning poses new challenges beyond standard continual learning, such as utilizing knowledge from, and preventing interference by, tasks learned on other clients. To resolve these issues, we propose a novel federated continual learning framework, Federated continual learning with Adaptive Parameter Communication (APC), which additively decomposes the network weights into global shared parameters and sparse task-specific parameters. This decomposition minimizes interference between incompatible tasks while enabling inter-client knowledge transfer through communication of the sparse task-specific parameters. Our framework is also communication-efficient, owing to the high sparsity of the task-specific parameters and sparse parameter updates. We validate APC against existing federated learning and local continual learning methods under varying degrees of task similarity across clients, and show that it significantly outperforms them while greatly reducing communication cost.
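The additive decomposition described in the abstract can be sketched as follows. This is a minimal illustrative example under assumptions, not the paper's implementation: the function name `decompose_weights`, the use of a per-task mask over the shared parameters, and all shapes are hypothetical, chosen only to show how a client weight could combine dense shared parameters with sparse task-specific ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def decompose_weights(global_base, task_mask, task_adaptive):
    """Client weight = masked global shared parameters + sparse task-specific parameters.

    Only `task_adaptive` (mostly zeros) would need to be communicated per task,
    which is the intuition behind the communication savings described above.
    """
    return global_base * task_mask + task_adaptive

# Hypothetical shapes: a single 4x4 weight matrix.
base = rng.standard_normal((4, 4))            # global shared parameters (server-aggregated)
mask = (rng.random((4, 4)) > 0.5).astype(float)  # per-task selection of shared weights

# Sparse task-specific parameters: ~10% nonzero entries.
adaptive = np.where(rng.random((4, 4)) > 0.9,
                    rng.standard_normal((4, 4)), 0.0)

weights = decompose_weights(base, mask, adaptive)

# Communication cost scales with the number of nonzero task-specific entries.
nonzero_fraction = np.mean(adaptive != 0)
```

A client would train `mask` and `adaptive` locally while the server periodically aggregates `base` across clients; only the sparse `adaptive` tensors would be exchanged for inter-client knowledge transfer.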
