Preserving Earlier Knowledge in Continual Learning with the Help of All Previous Feature Extractors

04/28/2021
by   Zhuoyun Li, et al.

Continual learning of new knowledge over time is a desirable capability for intelligent systems that must recognize an ever-growing number of object classes. With no old data stored, or only a very limited amount, an intelligent system often catastrophically forgets previously learned knowledge when acquiring new knowledge. Various approaches have recently been proposed to alleviate this catastrophic forgetting issue. However, knowledge learned earlier is generally less well preserved than knowledge learned more recently. To reduce the forgetting of earlier-learned knowledge in particular, and to improve overall continual learning performance, we propose a simple yet effective fusion mechanism that incorporates all previously learned feature extractors into the model. In addition, a new feature extractor is added to the model each time a new set of classes is learned, and feature extractor pruning is applied to keep the overall model size from growing rapidly. Experiments on multiple classification tasks show that the proposed approach effectively reduces the forgetting of old knowledge and achieves state-of-the-art continual learning performance.
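The abstract's core mechanism (freeze and prune each old feature extractor, add a fresh one per task, and fuse all their outputs) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name `FusionContinualModel`, the use of random linear projections as stand-ins for trained extractors, and the keep-first-channels pruning rule are all assumptions made for clarity.

```python
import numpy as np

class FusionContinualModel:
    """Sketch of fusing all previously learned feature extractors.

    Each "extractor" is a fixed random linear projection standing in for a
    trained network; shapes and the pruning rule are illustrative only.
    """

    def __init__(self, input_dim, feat_dim, keep_dim):
        self.input_dim = input_dim
        self.feat_dim = feat_dim   # output width of a freshly added extractor
        self.keep_dim = keep_dim   # width kept after pruning an old extractor
        self.extractors = []       # frozen (pruned) extractors from past tasks

    def add_task(self, rng):
        # Prune the previous extractor before freezing it, so total model
        # size grows slowly (here: simply keep the first `keep_dim` channels;
        # a real system would prune by importance).
        if self.extractors:
            W = self.extractors[-1]
            self.extractors[-1] = W[:, : self.keep_dim]
        # Add a new extractor for the new set of classes (random stand-in).
        self.extractors.append(
            rng.standard_normal((self.input_dim, self.feat_dim))
        )

    def features(self, x):
        # Fusion: concatenate the outputs of ALL extractors, old and new,
        # so earlier knowledge stays represented in the feature vector.
        return np.concatenate([x @ W for W in self.extractors], axis=-1)

rng = np.random.default_rng(0)
model = FusionContinualModel(input_dim=8, feat_dim=16, keep_dim=4)
model.add_task(rng)             # task 1: one extractor of width 16
model.add_task(rng)             # task 2: old one pruned to 4, new one is 16
x = rng.standard_normal((2, 8))
print(model.features(x).shape)  # fused width = 4 + 16 = 20
```

A downstream classifier would then be trained on the fused feature vector; because every past extractor still contributes channels, earlier tasks are not represented solely through the most recent network.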

Related research

- 08/11/2021: Discriminative Distillation to Reduce Class Confusion in Continual Learning. Successful continual learning of new knowledge would enable intelligent ...
- 03/24/2023: Leveraging Old Knowledge to Continually Learn New Classes in Medical Images. Class-incremental continual learning is a core step towards developing a...
- 05/29/2023: DeCoR: Defy Knowledge Forgetting by Predicting Earlier Audio Codes. Lifelong audio feature extraction involves learning new sound classes in...
- 11/13/2022: Mining Unseen Classes via Regional Objectness: A Simple Baseline for Incremental Segmentation. Incremental or continual learning has been extensively studied for image...
- 06/03/2022: Effects of Auxiliary Knowledge on Continual Learning. In Continual Learning (CL), a neural network is trained on a stream of d...
- 07/17/2021: Continual Learning for Task-oriented Dialogue System with Iterative Network Pruning, Expanding and Masking. This ability to learn consecutive tasks without forgetting how to perfor...
- 11/10/2022: Mitigating Forgetting in Online Continual Learning via Contrasting Semantically Distinct Augmentations. Online continual learning (OCL) aims to enable model learning from a non...
