Provable Continual Learning via Sketched Jacobian Approximations

12/09/2021
by Reinhard Heckel, et al.

An important problem in machine learning is the ability to learn tasks in a sequential manner. When trained with standard first-order methods, most models forget previously learned tasks when trained on a new task, a phenomenon often referred to as catastrophic forgetting. A popular approach to overcoming forgetting is to regularize the loss function by penalizing models that perform poorly on previous tasks. For example, elastic weight consolidation (EWC) regularizes with a quadratic form involving a diagonal matrix built from past data. While EWC works very well in some setups, we show that, even under otherwise ideal conditions, it can provably suffer catastrophic forgetting if the diagonal matrix is a poor approximation of the Hessian matrix of previous tasks. We propose a simple approach to overcome this: regularizing the training of a new task with sketches of the Jacobian matrix of past data. This provably overcomes catastrophic forgetting for linear models and for wide neural networks, at the cost of memory. The overarching goal of this paper is to provide insights into when regularization-based continual learning algorithms work and at what memory cost.
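The abstract does not specify the algorithm in detail; the following is only a minimal NumPy sketch of the general idea for a linear model, not the paper's exact method. All function names are hypothetical, and the choice of a Gaussian sketching matrix is an assumption. The point it illustrates is that, for a linear least-squares task, the Jacobian of the residuals is the data matrix itself, so a penalty of the form ||S J (w - w_old)||^2 with a small sketch S J stored in memory approximates the past-task Hessian penalty (w - w_old)^T J^T J (w - w_old), in contrast to EWC's diagonal approximation.

```python
# Minimal sketch (hypothetical names) of regularizing a new linear-regression
# task with a sketched Jacobian of past data. For a linear model y = X @ w,
# the Jacobian of the residuals is X, so storing S @ X (m x d, m << n) is
# enough to form the penalty || S X (w - w_old) ||^2.
import numpy as np

rng = np.random.default_rng(0)

def sketch_jacobian(X_past, m):
    """Compress the past-task Jacobian (here: the data matrix) with a Gaussian sketch."""
    n, _ = X_past.shape
    S = rng.normal(size=(m, n)) / np.sqrt(m)  # random sketching matrix (assumption)
    return S @ X_past                          # m x d summary of the past task

def regularized_loss(w, X_new, y_new, SX_past, w_old, lam):
    """New-task least-squares loss plus the sketched-Jacobian penalty on drifting from w_old."""
    fit = 0.5 * np.sum((X_new @ w - y_new) ** 2)
    penalty = 0.5 * lam * np.sum((SX_past @ (w - w_old)) ** 2)
    return fit + penalty

# Toy usage: two sequential linear-regression tasks sharing the same ground truth.
d, n = 20, 200
w_true = rng.normal(size=d)
X1, X2 = rng.normal(size=(n, d)), rng.normal(size=(n, d))
y1, y2 = X1 @ w_true, X2 @ w_true

w_old = np.linalg.lstsq(X1, y1, rcond=None)[0]  # weights after task 1
SX = sketch_jacobian(X1, m=50)                  # memory: 50 x d instead of n x d

# Closed-form minimizer of the regularized task-2 objective above.
lam = 1.0
A = X2.T @ X2 + lam * SX.T @ SX
b = X2.T @ y2 + lam * SX.T @ (SX @ w_old)
w_new = np.linalg.solve(A, b)
```

For comparison, an EWC-style penalty would replace `SX.T @ SX` with a diagonal matrix; the abstract's point is that this diagonal can be a poor approximation of `X1.T @ X1`, whereas the sketch preserves it up to the sketching error, at the cost of storing the m x d matrix.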


