Continual Learning of a Mixed Sequence of Similar and Dissimilar Tasks

12/18/2021
by Zixuan Ke, et al.

Existing research on continual learning of a sequence of tasks has focused on dealing with catastrophic forgetting, where the tasks are assumed to be dissimilar and to share little knowledge. Some work has also been done on transferring previously learned knowledge to a new task when the tasks are similar and share knowledge. To the best of our knowledge, no technique has been proposed for learning a sequence of mixed similar and dissimilar tasks that can both deal with forgetting and transfer knowledge forward and backward. This paper proposes such a technique, which learns both types of tasks in the same network. For dissimilar tasks, the algorithm focuses on preventing forgetting; for similar tasks, it focuses on selectively transferring the knowledge learned from similar previous tasks to improve learning of the new task. The algorithm also automatically detects whether a new task is similar to any previous task. Empirical evaluation on sequences of mixed tasks demonstrates the effectiveness of the proposed model.
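The abstract describes the control flow only at a high level: detect whether the incoming task is similar to any earlier task, then either reuse knowledge (similar case) or protect earlier tasks from forgetting (dissimilar case). The sketch below is our own illustration of that routing logic, not the paper's implementation; the class name, the similarity threshold, and the callables `probe_score` and `train_fn` are all hypothetical stand-ins for real model code.

```python
from typing import Callable, Dict, List


class MixedSequenceLearner:
    """Routes each new task through a similarity check, then either protects
    old knowledge (dissimilar case) or selectively reuses it (similar case).
    Illustrative only; names and threshold are assumptions, not the paper's API."""

    def __init__(self, similarity_threshold: float = 0.5):
        self.similarity_threshold = similarity_threshold
        self.task_knowledge: Dict[str, dict] = {}  # per-task learned state

    def detect_similar_tasks(self, probe_score: Callable[[str], float]) -> List[str]:
        # Score the new task against every previously learned task and keep the
        # ones whose (hypothetical) transferability score clears the threshold.
        return [
            prev for prev in self.task_knowledge
            if probe_score(prev) >= self.similarity_threshold
        ]

    def learn_task(self, task_id: str,
                   train_fn: Callable[[List[str]], dict],
                   probe_score: Callable[[str], float]) -> dict:
        similar = self.detect_similar_tasks(probe_score)
        if similar:
            # Similar previous tasks found: train while selectively reusing their
            # knowledge (forward transfer); they may also be refined (backward transfer).
            state = train_fn(similar)
        else:
            # No similar task: train in isolation and rely on a forgetting-avoidance
            # mechanism (e.g. parameter masking or regularization) to protect old tasks.
            state = train_fn([])
        self.task_knowledge[task_id] = state
        return state


# Toy usage: the scoring and training callables are placeholders for real model code.
learner = MixedSequenceLearner(similarity_threshold=0.5)
learner.learn_task("task_a", train_fn=lambda reuse: {"reused_from": reuse},
                   probe_score=lambda prev: 0.0)
learner.learn_task("task_b", train_fn=lambda reuse: {"reused_from": reuse},
                   probe_score=lambda prev: 0.9)
```

The key design point the abstract implies is that the similar and dissimilar branches share one network, so the detection step decides per task which mechanism (transfer or protection) is applied rather than training separate models.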


Related research

10/11/2021  Addressing the Stability-Plasticity Dilemma via Knowledge-Aware Continual Learning
Continual learning agents should incrementally learn a sequence of tasks...

06/26/2023  Parameter-Level Soft-Masking for Continual Learning
Existing research on task incremental learning in continual learning has...

10/11/2022  Toward Sustainable Continual Learning: Detection and Knowledge Repurposing of Similar Tasks
Most existing works on continual learning (CL) focus on overcoming the c...

03/20/2022  Continual Sequence Generation with Adaptive Compositional Modules
Continual learning is essential for real-world deployment when there is ...

03/16/2022  Continuous Detection, Rapidly React: Unseen Rumors Detection based on Continual Prompt-Tuning
Since open social platforms allow for a large and continuous flow of unv...

11/23/2022  Continual Learning of Natural Language Processing Tasks: A Survey
Continual learning (CL) is an emerging learning paradigm that aims to em...

07/14/2020  Lifelong Learning using Eigentasks: Task Separation, Skill Acquisition, and Selective Transfer
We introduce the eigentask framework for lifelong learning. An eigentask...
