The Ideal Continual Learner: An Agent That Never Forgets

04/29/2023
by Liangzu Peng, et al.

The goal of continual learning is to find a model that solves multiple learning tasks presented sequentially to the learner. A key challenge in this setting is that the learner may forget how to solve a previous task while learning a new one, a phenomenon known as catastrophic forgetting. To address this challenge, many practical methods have been proposed, including memory-based, regularization-based, and expansion-based methods. However, a rigorous theoretical understanding of these methods remains elusive. This paper aims to bridge the gap between theory and practice by proposing a new continual learning framework called the Ideal Continual Learner (ICL), which is guaranteed to avoid catastrophic forgetting by construction. We show that ICL unifies multiple well-established continual learning methods and gives new theoretical insights into their strengths and weaknesses. We also derive generalization bounds for ICL that allow us to theoretically quantify how rehearsal affects generalization. Finally, we connect ICL to several classic subjects and research topics of modern interest, which allows us to make historical remarks and suggest future directions.
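The abstract names memory-based (rehearsal) methods as one family of remedies for catastrophic forgetting. A minimal toy sketch of the phenomenon and of rehearsal, using two sequential least-squares tasks (the setup and all names here are illustrative assumptions, not the paper's ICL algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(w_true, n=200, d=5, noise=0.01):
    """Sample a linear-regression task y = X @ w_true + noise."""
    X = rng.normal(size=(n, d))
    y = X @ w_true + noise * rng.normal(size=n)
    return X, y

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

def gd(w, X, y, lr=0.05, steps=500):
    """Plain gradient descent on the mean-squared error."""
    for _ in range(steps):
        w = w - lr * (X.T @ (X @ w - y)) / len(y)
    return w

d = 5
X1, y1 = make_task(rng.normal(size=d))  # task 1
X2, y2 = make_task(rng.normal(size=d))  # task 2, different weights

# Train on task 1 first.
w = gd(np.zeros(d), X1, y1)

# Naive sequential training on task 2: the solution drifts toward
# task 2's minimizer and performance on task 1 collapses.
w_naive = gd(w.copy(), X2, y2)

# Rehearsal: replay a stored subset of task 1 alongside task 2,
# so the objective keeps both tasks in view.
mem = 50
X_mix = np.vstack([X2, X1[:mem]])
y_mix = np.concatenate([y2, y1[:mem]])
w_rehearsal = gd(w.copy(), X_mix, y_mix)

print("task-1 MSE, naive:    ", mse(w_naive, X1, y1))
print("task-1 MSE, rehearsal:", mse(w_rehearsal, X1, y1))
```

The rehearsal learner trades a little task-2 accuracy for a much smaller task-1 error, which is the forgetting/plasticity trade-off the paper's generalization bounds quantify.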

Related research

- Memory Bounds for Continual Learning (04/22/2022)
- Rehearsal revealed: The limits and merits of revisiting samples in continual learning (04/15/2021)
- Self-paced Weight Consolidation for Continual Learning (07/20/2023)
- Meta Continual Learning via Dynamic Programming (08/05/2020)
- Continual learning: a feature extraction formalization, an efficient algorithm, and fundamental obstructions (03/27/2022)
- Navigating Memory Construction by Global Pseudo-Task Simulation for Continual Learning (10/16/2022)
- Understanding Regularisation Methods for Continual Learning (06/11/2020)
