Continual Learning in the Teacher-Student Setup: Impact of Task Similarity

07/09/2021
by Sebastian Lee, et al.

Continual learning, the ability to learn many tasks in sequence, is critical for artificial learning systems. Yet standard training methods for deep networks often suffer from catastrophic forgetting, where learning new tasks erases knowledge of earlier ones. While the term catastrophic forgetting names the problem, the theoretical reasons why tasks interfere remain unclear. Here, we attempt to narrow this gap between theory and practice by studying continual learning in the teacher-student setup. We extend previous analytical work on two-layer networks in the teacher-student setup to multiple teachers. Using each teacher to represent a different task, we investigate how the relationship between teachers affects the amount of forgetting and transfer exhibited by the student when the task switches. In line with recent work, we find that when tasks depend on similar features, intermediate task similarity leads to the greatest forgetting. However, feature similarity is only one way in which tasks may be related. The teacher-student approach allows us to disentangle task similarity at the level of readouts (hidden-to-output weights) and features (input-to-hidden weights). We find a complex interplay between both types of similarity, initial transfer/forgetting rates, maximum transfer/forgetting, and long-term transfer/forgetting. Together, these results help illuminate the diverse factors contributing to catastrophic forgetting.
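To make the setup concrete, below is a minimal numpy sketch of the two-teacher, one-student experiment described above. It is not the authors' code: the scaled error function activation is replaced by tanh, the student and teachers share the same width, and all names and hyperparameters (dimensions, learning rate, number of SGD steps, similarity values) are illustrative assumptions. Two two-layer teachers are built with controllable feature (input-to-hidden) and readout (hidden-to-output) overlap, the student is trained online on task 1 and then task 2, and forgetting is read off as the increase in task-1 error after the switch.

```python
import numpy as np

rng = np.random.default_rng(0)

D, K = 500, 2            # input dimension, hidden units (same for teachers and student)
STEPS, LR = 20_000, 0.1  # online SGD steps per task, learning rate (illustrative)
FEATURE_SIM = 0.5        # overlap of the teachers' input-to-hidden weights
READOUT_SIM = 0.0        # overlap of the teachers' hidden-to-output weights

g = np.tanh                            # stand-in for the scaled error function
dg = lambda x: 1.0 - np.tanh(x) ** 2   # its derivative

def make_teachers(feat_sim, read_sim):
    """Two two-layer teachers whose weights overlap by the given similarities."""
    W1 = rng.standard_normal((K, D))
    W2 = feat_sim * W1 + np.sqrt(1.0 - feat_sim ** 2) * rng.standard_normal((K, D))
    v1 = rng.standard_normal(K)
    v2 = read_sim * v1 + np.sqrt(1.0 - read_sim ** 2) * rng.standard_normal(K)
    return (W1, v1), (W2, v2)

def net_out(W, v, X):
    """Output of a two-layer network on a batch of inputs X (rows are samples)."""
    return g(X @ W.T / np.sqrt(D)) @ v

def generalization_error(Ws, vs, Wt, vt, n=2000):
    """Mean squared error of the student against a teacher on fresh Gaussian inputs."""
    X = rng.standard_normal((n, D))
    return 0.5 * np.mean((net_out(Ws, vs, X) - net_out(Wt, vt, X)) ** 2)

def train_online(Ws, vs, Wt, vt, steps=STEPS, lr=LR):
    """Plain online SGD of the student on fresh samples labelled by one teacher."""
    for _ in range(steps):
        x = rng.standard_normal(D)
        h = Ws @ x / np.sqrt(D)                      # student pre-activations
        err = vs @ g(h) - net_out(Wt, vt, x[None])[0]
        grad_v = err * g(h)                          # readout gradient
        grad_W = err * np.outer(vs * dg(h), x) / np.sqrt(D)  # feature gradient
        vs -= lr * grad_v
        Ws -= lr * grad_W
    return Ws, vs

# Build two related teachers (tasks) and a randomly initialised student.
(T1_W, T1_v), (T2_W, T2_v) = make_teachers(FEATURE_SIM, READOUT_SIM)
Ws, vs = rng.standard_normal((K, D)), rng.standard_normal(K)

Ws, vs = train_online(Ws, vs, T1_W, T1_v)                  # train on task 1
err1_before = generalization_error(Ws, vs, T1_W, T1_v)
Ws, vs = train_online(Ws, vs, T2_W, T2_v)                  # switch to task 2
err1_after = generalization_error(Ws, vs, T1_W, T1_v)
err2_after = generalization_error(Ws, vs, T2_W, T2_v)

print(f"task-1 error before switch: {err1_before:.4f}")
print(f"task-1 error after switch:  {err1_after:.4f}  (increase = forgetting)")
print(f"task-2 error after switch:  {err2_after:.4f}")
```

Sweeping FEATURE_SIM and READOUT_SIM independently in a sketch like this gives a simple way to probe the two kinds of task similarity separately, which is the comparison the abstract describes.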
