Few-Shot Self Reminder to Overcome Catastrophic Forgetting

12/03/2018
by Junfeng Wen, et al.

Deep neural networks are known to suffer from catastrophic forgetting: when learning a sequence of tasks, they tend to lose knowledge acquired on earlier tasks while training on new ones. This failure hinders the application of deep learning based vision systems in continual learning settings. In this work, we present a simple yet surprisingly effective way of preventing catastrophic forgetting. Our method, called Few-shot Self Reminder (FSR), regularizes the network against changing its learned behaviour by performing logit matching on selected samples from the old tasks kept in episodic memory. Surprisingly, this simple approach needs only a small amount of replayed data to outperform previous methods in knowledge retention. We demonstrate the superiority of our method over previous ones in two different continual learning settings on popular benchmarks, as well as on a new continual learning problem where tasks are designed to be more dissimilar.
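The abstract describes FSR as a logit-matching penalty on a few samples kept in episodic memory. Below is a minimal PyTorch sketch of such a loss; the weight alpha, the squared-error distance, and the batch layout are illustrative assumptions rather than the paper's exact formulation.

import torch.nn.functional as F

def fsr_loss(model, new_batch, memory_batch, alpha=1.0):
    # new_batch: (inputs, labels) from the task currently being learned.
    # memory_batch: (inputs, recorded_logits) for the few samples kept in
    # episodic memory, with logits recorded before the task switch,
    # e.g. via: with torch.no_grad(): recorded_logits = model(mem_inputs)
    x_new, y_new = new_batch
    x_mem, old_logits = memory_batch

    # Standard supervised loss on the new task.
    task_loss = F.cross_entropy(model(x_new), y_new)

    # Logit matching: penalize drift of the network's outputs on the
    # stored samples away from their previously recorded logits.
    match_loss = F.mse_loss(model(x_mem), old_logits)

    return task_loss + alpha * match_loss

In a training loop, this loss would simply replace the plain cross-entropy term whenever a memory batch is available; larger alpha trades new-task accuracy for stronger retention of old-task behaviour.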

Related research

Gradient Episodic Memory with a Soft Constraint for Continual Learning (11/16/2020)
Catastrophic forgetting in continual learning is a common destructive ph...

Dropout as an Implicit Gating Mechanism For Continual Learning (04/24/2020)
In recent years, neural networks have demonstrated an outstanding abilit...

Provable Continual Learning via Sketched Jacobian Approximations (12/09/2021)
An important problem in machine learning is the ability to learn tasks i...

Continual Learning by Modeling Intra-Class Variation (10/11/2022)
It has been observed that neural networks perform poorly when the data o...

Prototype Reminding for Continual Learning (05/23/2019)
Continual learning is a critical ability of continually acquiring and tr...

Few-shot Continual Learning: a Brain-inspired Approach (04/19/2021)
It is an important yet challenging setting to continually learn new task...

Condensed Composite Memory Continual Learning (02/19/2021)
Deep Neural Networks (DNNs) suffer from a rapid decrease in performance ...
