
A Theoretical Analysis of Catastrophic Forgetting through the NTK Overlap Matrix

10/07/2020
by Thang Doan, et al.

Continual learning (CL) is a setting in which an agent has to learn from an incoming stream of data during its entire lifetime. Although major advances have been made in the field, one recurring problem that remains unsolved is Catastrophic Forgetting (CF). While the issue has been extensively studied empirically, little attention has been paid to it from a theoretical angle. In this paper, we show that the impact of CF increases as two tasks become more aligned. We introduce a measure of task similarity, the NTK overlap matrix, which lies at the core of CF. We analyze common projected-gradient algorithms and demonstrate how they mitigate forgetting. We then propose a variant of Orthogonal Gradient Descent (OGD) which leverages the structure of the data through Principal Component Analysis (PCA). Experiments support our theoretical findings and show how our method reduces CF on classical CL datasets.
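The abstract's main ingredients (per-task gradient/NTK features, a measure of overlap between their subspaces, and an OGD variant that compresses the stored directions with PCA) can be illustrated with a short NumPy sketch. This is not the authors' implementation: the matrices J1 and J2 are random stand-ins for per-example Jacobians, and the helper names (task_subspace, overlap, project_out) as well as the overlap proxy used here, the mean squared cosine of principal angles between subspaces, are illustrative assumptions rather than the paper's exact definitions.

```python
# Hedged sketch (not the authors' code): a toy illustration of
# (1) an NTK-style overlap score between two tasks, computed from
#     per-example gradient features, and
# (2) a PCA-compressed orthogonal-projection step in the spirit of OGD,
#     where new-task gradients are projected onto the complement of the
#     top principal directions of the old task's gradients.
# J1 and J2 stand in for per-example Jacobians
# (rows = examples, columns = parameters).

import numpy as np

rng = np.random.default_rng(0)

def task_subspace(J, k):
    """Top-k right singular vectors of the per-example gradient matrix J.
    These span the parameter-space directions the task's gradient
    features occupy most strongly (the PCA step)."""
    _, _, Vt = np.linalg.svd(J, full_matrices=False)
    return Vt[:k].T                                   # shape (n_params, k)

def overlap(J1, J2, k):
    """A scalar proxy for the overlap between two tasks: the mean squared
    cosine of the principal angles between their top-k gradient subspaces
    (1 = identical subspaces, 0 = orthogonal)."""
    U1, U2 = task_subspace(J1, k), task_subspace(J2, k)
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)    # cosines of principal angles
    return float(np.mean(s**2))

def project_out(grad, U):
    """OGD-style update: remove from `grad` its component inside the stored
    subspace U, so a step on the new task leaves the old task's outputs
    unchanged to first order."""
    return grad - U @ (U.T @ grad)

n_params, n_examples, k = 200, 50, 10
J1 = rng.standard_normal((n_examples, n_params))                      # "old task" gradients
J2 = 0.7 * J1 + 0.3 * rng.standard_normal((n_examples, n_params))     # correlated "new task"

print("overlap(task1, task2) ~", round(overlap(J1, J2, k), 3))

U_old = task_subspace(J1, k)                    # compressed memory kept after task 1
g_new = rng.standard_normal(n_params)           # a gradient computed on the new task
g_proj = project_out(g_new, U_old)
print("residual alignment with old subspace:",
      round(float(np.linalg.norm(U_old.T @ g_proj)), 6))
```

Running the sketch, the more the two stand-in gradient matrices are correlated, the closer the overlap score is to 1, while the projected gradient has (numerically) zero component inside the stored old-task subspace, which is the mechanism by which projection-based methods are meant to limit forgetting.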



Related research

- Generalisation Guarantees for Continual Learning with Orthogonal Gradient Descent (06/21/2020): In continual learning settings, deep neural networks are prone to catast...
- Theoretical Understanding of the Information Flow on Continual Learning Performance (04/26/2022): Continual learning (CL) is a setting in which an agent has to learn from...
- Orthogonal Gradient Descent for Continual Learning (10/15/2019): Neural networks are achieving state of the art and sometimes super-human...
- Catastrophic Forgetting in Deep Graph Networks: an Introductory Benchmark for Graph Classification (03/22/2021): In this work, we study the phenomenon of catastrophic forgetting in the ...
- Structured Compression and Sharing of Representational Space for Continual Learning (01/23/2020): Humans are skilled at learning adaptively and efficiently throughout the...
- Embodiment dictates learnability in neural controllers (10/15/2019): Catastrophic forgetting continues to severely restrict the learnability ...