Probing Representation Forgetting in Supervised and Unsupervised Continual Learning

03/24/2022
by MohammadReza Davari, et al.

Continual learning research typically focuses on tackling the phenomenon of catastrophic forgetting in neural networks. Catastrophic forgetting is associated with an abrupt loss of knowledge previously learned by a model when the task, or more broadly the data distribution, being trained on changes. In supervised learning problems this forgetting, resulting from a change in the model's representation, is typically measured by evaluating the drop in old-task performance. However, a model's representation can change without losing knowledge about prior tasks. In this work we consider the concept of representation forgetting, measured as the difference in performance of an optimal linear classifier before and after a new task is introduced. Using this tool we revisit a number of standard continual learning benchmarks and observe that, through this lens, model representations trained without any explicit control for forgetting often undergo little representation forgetting and can be comparable to methods that explicitly control for forgetting, especially in longer task sequences. We also show that representation forgetting can yield new insights into the effect of model capacity and the loss function used in continual learning. Based on these results, we show that a simple yet competitive approach is to learn representations continually with standard supervised contrastive learning, while constructing prototypes from class samples when queried on old tasks.
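
As a rough illustration of the two ideas in the abstract, a linear probe for measuring representation forgetting and a prototype (nearest-class-mean) classifier over the learned features, the sketch below shows one way they could be set up. This is a minimal sketch, not the authors' implementation: the feature matrices, labels, and function names are illustrative assumptions, and it presumes features have already been extracted from a frozen encoder.

```python
# Minimal sketch (not the paper's code): linear probing of frozen features
# and nearest-prototype classification, assuming numpy feature matrices.
import numpy as np
from sklearn.linear_model import LogisticRegression


def probe_accuracy(train_feats, train_labels, test_feats, test_labels):
    """Accuracy of an (approximately) optimal linear classifier fit on frozen features."""
    clf = LogisticRegression(max_iter=2000)
    clf.fit(train_feats, train_labels)
    return clf.score(test_feats, test_labels)


def build_prototypes(feats, labels):
    """Class prototypes: per-class mean of L2-normalised feature vectors."""
    feats = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    return {c: feats[labels == c].mean(axis=0) for c in np.unique(labels)}


def prototype_predict(feats, prototypes):
    """Assign each sample to the class whose prototype has highest cosine similarity."""
    classes = np.array(sorted(prototypes))
    protos = np.stack([prototypes[c] for c in classes])           # [C, d]
    feats = feats / np.linalg.norm(feats, axis=1, keepdims=True)  # [N, d]
    return classes[(feats @ protos.T).argmax(axis=1)]


# Representation forgetting of task 1 after training on task 2 (hypothetical
# arrays): probe the same task-1 data with features extracted before and
# after the task-2 update, and take the difference in probe accuracy.
# forgetting = probe_accuracy(f1_tr_before, y1_tr, f1_te_before, y1_te) \
#            - probe_accuracy(f1_tr_after,  y1_tr, f1_te_after,  y1_te)
```
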
