Is forgetting less a good inductive bias for forward transfer?

03/14/2023
by   Jiefeng Chen, et al.

One of the main motivations for studying continual learning is that the problem setting allows a model to accrue knowledge from past tasks in order to learn new tasks more efficiently. However, recent studies suggest that the key metric continual learning algorithms optimize, reduction in catastrophic forgetting, does not correlate well with the forward transfer of knowledge. We believe that the conclusion reached in previous work stems from how forward transfer was measured. We argue that the measure of forward transfer to a task should not be affected by the restrictions placed on the continual learner in order to preserve knowledge of previous tasks. Instead, forward transfer should be measured by how easy it is to learn a new task given a set of representations produced by continual learning on previous tasks. Under this notion of forward transfer, we evaluate different continual learning algorithms on a variety of image classification benchmarks. Our results indicate that less forgetful representations lead to better forward transfer, suggesting a strong correlation between retaining past information and learning efficiency on new tasks. Further, we find that less forgetful representations are more diverse and discriminative than their forgetful counterparts.
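The proposed notion of forward transfer, how easily a new task can be learned from the representations left behind by continual learning, can be illustrated with a simple linear probe on frozen features. The sketch below is a minimal, hypothetical illustration of this idea (the function name, data, and protocol details are assumptions, not the paper's exact setup):

```python
# Hedged sketch: measure forward transfer as the held-out accuracy of a
# linear probe trained on frozen representations of a new task's data.
# All names and data here are illustrative, not the paper's exact protocol.
import numpy as np
from sklearn.linear_model import LogisticRegression

def probe_accuracy(features, labels, train_frac=0.8, seed=0):
    """Fit a linear probe on frozen features; return held-out accuracy."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(features))
    cut = int(train_frac * len(features))
    tr, te = idx[:cut], idx[cut:]
    clf = LogisticRegression(max_iter=1000).fit(features[tr], labels[tr])
    return clf.score(features[te], labels[te])

# Toy example: stand-in "representations" from a hypothetical continual
# learner, evaluated on a synthetic, linearly separable new task.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 16))
w = rng.normal(size=16)
y = (X @ w > 0).astype(int)
acc = probe_accuracy(X, y)
print(f"forward-transfer proxy (probe accuracy): {acc:.2f}")
```

Under this protocol, representations from a less forgetful learner would be compared against a more forgetful one by probing both on the same new-task data; the probe leaves the representations untouched, so the measure is not confounded by constraints placed on the learner to preserve old tasks.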


