Toward Understanding Catastrophic Forgetting in Continual Learning

08/02/2019
by Cuong V. Nguyen, et al.

We study the relationship between catastrophic forgetting and properties of task sequences. In particular, given a sequence of tasks, we would like to understand which properties of this sequence influence the error rates of continual learning algorithms trained on the sequence. To this end, we propose a new procedure that makes use of recent developments in task space modeling as well as correlation analysis to specify and analyze the properties we are interested in. As an application, we apply our procedure to study two properties of a task sequence: (1) total complexity and (2) sequential heterogeneity. We show that, for some state-of-the-art algorithms, error rates are strongly and positively correlated with a task sequence's total complexity. We also show that, surprisingly, the error rates in some cases have no correlation, or even a negative correlation, with sequential heterogeneity. Our findings suggest directions for improving continual learning benchmarks and methods.
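To make the procedure concrete, below is a minimal sketch of the kind of correlation analysis the abstract describes, assuming we already have one total-complexity score and one error rate per task sequence. The array values and variable names are illustrative placeholders, not the paper's data or code:

```python
# Minimal sketch of a correlation analysis between a task-sequence
# property and continual learning error rates. The numbers below are
# illustrative placeholders, not data from the paper.
import numpy as np
from scipy.stats import pearsonr

# One entry per task sequence: the sequence's total complexity (as
# produced by some task space model) and the error rate of a continual
# learning algorithm trained on that sequence.
total_complexity = np.array([0.42, 0.55, 0.61, 0.70, 0.78, 0.83])
error_rate = np.array([0.10, 0.14, 0.15, 0.19, 0.22, 0.25])

r, p = pearsonr(total_complexity, error_rate)
print(f"Pearson r = {r:.3f}, p-value = {p:.3f}")
# A large positive r here corresponds to the paper's finding that error
# rates are strongly, positively correlated with total complexity; the
# same analysis applied to sequential heterogeneity can yield near-zero
# or negative correlations.
```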
