Maslow's Hammer for Catastrophic Forgetting: Node Re-Use vs Node Activation

05/18/2022
by Sebastian Lee, et al.

Continual learning, the ability to learn new tasks in sequence while maintaining performance on old tasks, remains particularly challenging for artificial neural networks. Surprisingly, the amount of forgetting does not increase with the dissimilarity between the learned tasks, but appears to be worst in an intermediate similarity regime. In this paper we theoretically analyse both a synthetic teacher-student framework and a real-data setup to provide an explanation of this phenomenon, which we name the Maslow's hammer hypothesis. Our analysis reveals a trade-off between node activation and node re-use that results in the worst forgetting occurring in the intermediate regime. Using this understanding, we reinterpret popular algorithmic interventions for catastrophic interference in terms of this trade-off and identify the regimes in which each is most effective.
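
As a concrete illustration of the teacher-student framework referred to above, the Python sketch below (a minimal stand-in with assumed hyperparameters and an assumed similarity parameter name `alpha`, not the authors' released code) trains a small erf student online on one teacher and then on a second teacher whose input-to-hidden weights overlap with the first by `alpha`; the rise in task-1 error after task-2 training is the forgetting whose dependence on task similarity the paper analyses.

```python
# Minimal, illustrative sketch of an erf teacher-student continual-learning
# setup: train a student online on teacher 1, then on teacher 2 whose weights
# overlap with teacher 1 by `alpha`, and measure how much task-1 error grows.
# All hyperparameters are illustrative, not taken from the paper.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(0)
D, M, K = 500, 2, 4        # input dim, teacher hidden units, student hidden units
LR, STEPS = 0.05, 20000    # online SGD learning rate and steps per task


def teacher_pair(alpha):
    """Two teachers whose input-to-hidden weights have overlap alpha."""
    w1 = rng.standard_normal((M, D))
    w2 = alpha * w1 + np.sqrt(1.0 - alpha**2) * rng.standard_normal((M, D))
    return w1, w2


def forward(w, v, x):
    # Soft-committee-style network: y = v . erf(w x / sqrt(D))
    return v @ erf(w @ x / np.sqrt(D))


def train(sw, sv, tw, tv):
    for _ in range(STEPS):
        x = rng.standard_normal(D)
        y = forward(tw, tv, x)             # teacher label for this input
        pre = sw @ x / np.sqrt(D)          # student pre-activations
        h = erf(pre)
        err = sv @ h - y
        # Gradients of 0.5 * err^2 w.r.t. student first- and second-layer weights
        dpre = (2.0 / np.sqrt(np.pi)) * np.exp(-pre**2) / np.sqrt(D)
        sw -= LR * err * np.outer(sv * dpre, x)
        sv -= LR * err * h
    return sw, sv


def task_error(sw, sv, tw, tv, n=2000):
    xs = rng.standard_normal((n, D))
    return 0.5 * np.mean([(forward(sw, sv, x) - forward(tw, tv, x)) ** 2 for x in xs])


tv = np.ones(M)
for alpha in (0.0, 0.5, 1.0):              # low / intermediate / high similarity
    t1, t2 = teacher_pair(alpha)
    sw, sv = rng.standard_normal((K, D)), 0.5 * rng.standard_normal(K)
    sw, sv = train(sw, sv, t1, tv)
    before = task_error(sw, sv, t1, tv)
    sw, sv = train(sw, sv, t2, tv)
    after = task_error(sw, sv, t1, tv)
    print(f"alpha={alpha:.1f}  task-1 error before/after task 2: {before:.4f} / {after:.4f}")
```

A single seed and three similarity values are only a sketch; sweeping `alpha` finely and averaging over seeds is what would expose the non-monotonic, intermediate-similarity peak in forgetting that the abstract describes.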


