Rehearsal revealed: The limits and merits of revisiting samples in continual learning

04/15/2021
by   Eli Verwimp, et al.

Learning from non-stationary data streams and overcoming catastrophic forgetting still pose serious challenges for machine learning research. Rather than aiming to improve the state of the art, in this work we provide insight into the limits and merits of rehearsal, one of continual learning's most established methods. We hypothesize that models trained sequentially with rehearsal tend to stay in the same low-loss region after a task has finished, but are at risk of overfitting on its sample memory, hence harming generalization. We provide both conceptual and strong empirical evidence on three benchmarks for both behaviors, bringing novel insight into the dynamics of rehearsal and continual learning in general. Finally, we interpret important continual learning works in the light of our findings, allowing for a deeper understanding of their successes.
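The rehearsal setup the abstract refers to keeps a small memory of past samples and replays them alongside new data. As a minimal, generic sketch of that idea (not the authors' implementation; the class name, capacity, and reservoir-sampling policy are illustrative assumptions):

```python
import random

class RehearsalMemory:
    """A fixed-size sample memory filled via reservoir sampling.

    Illustrative sketch of rehearsal in continual learning:
    during training on a data stream, stored samples are mixed
    into each new batch to mitigate catastrophic forgetting.
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0
        self.rng = random.Random(seed)

    def observe(self, example):
        """Offer one streamed example to the memory."""
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Keep the new example with probability capacity / seen,
            # so every example seen so far is equally likely to be stored.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        """Draw up to k stored examples to replay with the current batch."""
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))
```

Because the memory is tiny relative to the stream, repeatedly replaying the same stored samples is exactly where the overfitting risk discussed in the paper arises.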




Code Repositories

RehearsalRevealed

Official codebase of the "Rehearsal revealed: The limits and merits of revisiting samples in continual learning" paper.

