
How catastrophic can catastrophic forgetting be in linear regression?

05/19/2022
by Itay Evron, et al.

To better understand catastrophic forgetting, we study fitting an overparameterized linear model to a sequence of tasks with different input distributions. We analyze how much the model forgets the true labels of earlier tasks after training on subsequent tasks, obtaining exact expressions and bounds. We establish connections between continual learning in the linear setting and two other research areas: alternating projections and the Kaczmarz method. In specific settings, we highlight differences between forgetting and convergence to the offline solution as studied in those areas. In particular, when T tasks in d dimensions are presented cyclically for k iterations, we prove an upper bound of T^2 * min{1/sqrt(k), d/k} on the forgetting. This stands in contrast to the convergence to the offline solution, which can be arbitrarily slow according to existing alternating projection results. We further show that the T^2 factor can be lifted when tasks are presented in a random ordering.


Related research

06/06/2019 · Localizing Catastrophic Forgetting in Neural Networks
Artificial neural networks (ANNs) suffer from catastrophic forgetting wh...

05/25/2023 · Overcoming Catastrophic Forgetting in Massively Multilingual Continual Learning
Real-life multilingual systems should be able to efficiently incorporate...

07/09/2021 · Continual Learning in the Teacher-Student Setup: Impact of Task Similarity
Continual learning, the ability to learn many tasks in sequence, is critic...

01/06/2020 · Dissecting Catastrophic Forgetting in Continual Learning by Deep Visualization
Interpreting the behaviors of Deep Neural Networks (usually considered a...

05/03/2020 · Explaining How Deep Neural Networks Forget by Deep Visualization
Explaining the behaviors of deep neural networks, usually considered as ...

03/17/2023 · Fixed Design Analysis of Regularization-Based Continual Learning
We consider a continual learning (CL) problem with two linear regression...

10/31/2022 · Reduce Catastrophic Forgetting of Dense Retrieval Training with Teleportation Negatives
In this paper, we investigate the instability in the standard dense retr...