Wide Neural Networks Forget Less Catastrophically

10/21/2021
by Seyed-Iman Mirzadeh, et al.

A growing body of research in continual learning is devoted to overcoming the "Catastrophic Forgetting" of neural networks by designing new algorithms that are more robust to distribution shifts. While recent progress in the continual learning literature is encouraging, our understanding of which properties of neural networks contribute to catastrophic forgetting is still limited. To address this, instead of focusing on continual learning algorithms, we focus on the model itself: we study the impact of the "width" of the neural network architecture on catastrophic forgetting and show that width has a surprisingly significant effect on forgetting. To explain this effect, we study the learning dynamics of the network from various perspectives, such as gradient norm and sparsity, orthogonalization, and the lazy training regime. We provide potential explanations that are consistent with the empirical results across different architectures and continual learning benchmarks.
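The forgetting the abstract refers to is commonly quantified as the average drop from each task's best accuracy to its accuracy after training on the final task. A minimal sketch of that standard metric (the accuracy matrix below is hypothetical, purely for illustration, not data from the paper):

```python
def average_forgetting(acc):
    """Average forgetting over a task sequence.

    acc[i][j] is accuracy on task j after training on task i (0-indexed,
    square matrix, one row per training stage). For each task j except the
    last, forgetting is the best accuracy achieved before the final stage
    minus the accuracy at the final stage; the metric is the mean of these.
    """
    T = len(acc)
    drops = []
    for j in range(T - 1):
        best = max(acc[i][j] for i in range(j, T - 1))
        drops.append(best - acc[T - 1][j])
    return sum(drops) / len(drops)

# Hypothetical accuracies for 3 sequential tasks (row i: after task i).
acc = [
    [0.95, 0.10, 0.10],
    [0.70, 0.93, 0.12],
    [0.60, 0.80, 0.94],
]
print(average_forgetting(acc))  # ((0.95-0.60) + (0.93-0.80)) / 2 ≈ 0.24
```

Under the paper's claim, a wider network trained on the same task sequence would yield a lower value of this metric than a narrow one.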

Related research

02/01/2022: Architecture Matters in Continual Learning
A large body of research in continual learning is devoted to overcoming ...

03/29/2023: How Efficient Are Today's Continual Learning Algorithms?
Supervised Continual learning involves updating a deep neural network (D...

11/21/2022: On the Robustness, Generalization, and Forgetting of Shape-Texture Debiased Continual Learning
Tremendous progress has been made in continual learning to maintain good...

05/17/2021: Continual Learning with Echo State Networks
Continual Learning (CL) refers to a learning setup where data is non sta...

08/02/2019: Toward Understanding Catastrophic Forgetting in Continual Learning
We study the relationship between catastrophic forgetting and properties...

08/27/2023: Universal Graph Continual Learning
We address catastrophic forgetting issues in graph learning as incoming ...

10/04/2020: Remembering for the Right Reasons: Explanations Reduce Catastrophic Forgetting
The goal of continual learning (CL) is to learn a sequence of tasks with...
