Statistical Mechanical Analysis of Catastrophic Forgetting in Continual Learning with Teacher and Student Networks

05/16/2021
by Haruka Asanuma, et al.

When a computational system continually learns from an ever-changing environment, it rapidly forgets its past experiences. This phenomenon is called catastrophic forgetting. Although many methods have been proposed to avoid catastrophic forgetting, most are based on intuitive insights into the phenomenon, and their performance has been evaluated through numerical experiments on benchmark datasets. In this study, we therefore provide a theoretical framework for analyzing catastrophic forgetting using teacher-student learning. Teacher-student learning is a framework with two neural networks: one acts as the target function of supervised learning (the teacher), and the other is the network that learns it (the student). To analyze continual learning in the teacher-student framework, we characterize the similarity of tasks by two quantities: the similarity of the input distributions and the similarity of the input-output relationships of the target functions. Within this framework, we give a qualitative account of how a single-layer linear student forgets tasks. The analysis shows that the network can avoid catastrophic forgetting when the similarity between the input distributions is small and the similarity between the input-output relationships of the target functions is large. The analysis also reveals a characteristic phenomenon called overshoot: even after the student has once undergone catastrophic forgetting, further training on the current task can bring its performance on the forgotten task back to a reasonable level.
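To make the setup concrete, below is a minimal, self-contained sketch of sequential teacher-student learning with a single-layer linear student. This is not the paper's analysis or code: the teacher correlation rho stands in for the similarity of the input-output relationships, the per-task covariances cov1/cov2 stand in for the input-distribution similarity, and all names and hyperparameters are illustrative assumptions.

```python
# Sketch (illustrative, not the paper's equations): a single-layer linear
# student trained online on two linear "teacher" tasks in sequence, while
# we monitor how its error on task 1 evolves during task-2 training.
import numpy as np

rng = np.random.default_rng(0)
d = 200        # input dimension (assumed)
eta = 0.2      # SGD learning rate, scaled by 1/d in the update (assumed)
steps = 50000  # online SGD steps per task (assumed)

# rho controls how similar the two teachers' input-output mappings are.
rho = 0.8
teacher1 = rng.standard_normal(d) / np.sqrt(d)
teacher2 = rho * teacher1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(d) / np.sqrt(d)

# Diagonal input covariances per task; their mismatch plays the role of
# the input-distribution (dis)similarity discussed in the abstract.
cov1 = np.ones(d)
cov2 = np.concatenate([np.full(d // 2, 2.0), np.full(d - d // 2, 0.5)])

def gen_error(w, teacher, cov):
    """E[(w.x - teacher.x)^2] for x ~ N(0, diag(cov))."""
    diff = w - teacher
    return np.sum(cov * diff**2)

def train(w, teacher, cov, probe):
    """Online SGD on one task; `probe` = (teacher, cov) of the error we track."""
    errs = []
    for t in range(steps):
        x = np.sqrt(cov) * rng.standard_normal(d)
        w = w - (eta / d) * (w @ x - teacher @ x) * x  # grad of 0.5*(w.x - y)^2
        if t % 1000 == 0:
            errs.append(gen_error(w, *probe))
    return w, errs

w = np.zeros(d)
w, _ = train(w, teacher1, cov1, probe=(teacher1, cov1))     # learn task 1
w, err1 = train(w, teacher2, cov2, probe=(teacher1, cov1))  # learn task 2

# Depending on the similarity parameters, err1 may rise past its final
# value and then partially recover; such a non-monotone curve is the
# kind of "overshoot" the abstract describes.
print(f"task-1 error: start {err1[0]:.4f}, peak {max(err1):.4f}, end {err1[-1]:.4f}")
```

Sweeping rho and the gap between cov1 and cov2 in a simulation like this is one way to probe, numerically, the regime the analysis identifies (dissimilar input distributions, similar target functions) and to look for the overshoot curve.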

Related research

07/09/2021
Continual Learning in the Teacher-Student Setup: Impact of Task Similarity
Continual learning, the ability to learn many tasks in sequence, is critic...

05/18/2022
Maslow's Hammer for Catastrophic Forgetting: Node Re-Use vs Node Activation
Continual learning, learning new tasks in sequence while maintaining pe...

03/12/2022
Sparsity and Heterogeneous Dropout for Continual Learning in the Null Space of Neural Activations
Continual/lifelong learning from a non-stationary input data stream is a...

06/16/2019
Conditional Computation for Continual Learning
Catastrophic forgetting of connectionist neural networks is caused by th...

10/07/2020
A Theoretical Analysis of Catastrophic Forgetting through the NTK Overlap Matrix
Continual learning (CL) is a setting in which an agent has to learn from...

09/01/2022
An Incremental Learning framework for Large-scale CTR Prediction
In this work we introduce an incremental learning framework for Click-Th...

05/19/2022
How catastrophic can catastrophic forgetting be in linear regression?
To better understand catastrophic forgetting, we study fitting an overpa...
