Reducing Representation Drift in Online Continual Learning

04/11/2021
by Lucas Caccia, et al.

We study the online continual learning paradigm, where agents must learn from a changing distribution under constrained memory and compute. Previous work often tackles catastrophic forgetting by constraining changes in the space of model parameters. In this work we instead focus on the change in representations of previously observed data caused by the arrival of previously unseen classes in the incoming data stream. We highlight the issues that arise in the practical setting where new classes must be distinguished from all previously seen classes. Starting from a popular approach, experience replay, we consider a metric-learning loss, the triplet loss, which lets us constrain the behavior of representations more explicitly. We hypothesize, and empirically confirm, that the selection of negatives used in the triplet loss plays a major role in the representation change, or drift, of previously observed data, and that this drift can be greatly reduced by appropriate negative selection. Motivated by this, we further introduce a simple adjustment to the standard cross-entropy loss used in prior experience-replay methods that achieves a similar effect. Our approach greatly improves the performance of experience replay and obtains state-of-the-art results on several existing benchmarks in online continual learning, while remaining efficient in both memory and compute.
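
The cross-entropy adjustment mentioned above lends itself to a short illustration. The abstract does not spell out the exact formulation, so the following is a minimal PyTorch sketch of one plausible reading: for incoming samples, logits of classes absent from the incoming batch are masked out, so that gradients from new-class samples cannot displace representations of previously seen classes, while replayed samples keep the standard loss. All names (asymmetric_replay_loss, model, x_in, x_re) are illustrative, not taken from the paper.

```python
# Minimal, hypothetical sketch of a masked cross-entropy for experience
# replay, under the assumption described above. Names are illustrative.
import torch
import torch.nn.functional as F

def asymmetric_replay_loss(model, x_in, y_in, x_re, y_re):
    """Incoming samples compete only among classes present in the incoming
    batch; replayed samples compete among all classes as usual."""
    logits_in = model(x_in)   # [batch_in, num_classes]
    logits_re = model(x_re)   # [batch_replay, num_classes]

    # Boolean mask over the classes that appear in the incoming batch.
    present = torch.zeros(logits_in.size(1), dtype=torch.bool,
                          device=logits_in.device)
    present[y_in] = True

    # Mask absent (mostly previously seen) classes out of the incoming loss,
    # so new-class gradients do not push their logits down.
    masked_in = logits_in.masked_fill(~present, float('-inf'))

    loss_in = F.cross_entropy(masked_in, y_in)
    loss_re = F.cross_entropy(logits_re, y_re)  # standard loss on replay
    return loss_in + loss_re
```

An analogous reading of the negative-selection result for the triplet loss would restrict the negatives of anchors drawn from the incoming batch to that same batch, keeping stored old-class samples out of the role of negatives.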

Related research

03/08/2022 · New Insights on Reducing Abrupt Representation Change in Online Continual Learning
In the online continual learning paradigm, agents must learn from a chan...

08/11/2019 · Online Continual Learning with Maximally Interfered Retrieval
Continual learning, the setting where a learning agent is faced with a n...

08/31/2020 · Adversarial Shapley Value Experience Replay for Task-Free Continual Learning
Continual learning is a branch of deep learning that seeks to strike a b...

10/12/2020 · Rethinking Experience Replay: a Bag of Tricks for Continual Learning
In Continual Learning, a Neural Network is trained on a stream of data w...

04/24/2021 · Class-Incremental Experience Replay for Continual Learning under Concept Drift
Modern machine learning systems need to be able to cope with constantly ...

10/16/2022 · Navigating Memory Construction by Global Pseudo-Task Simulation for Continual Learning
Continual learning faces a crucial challenge of catastrophic forgetting....

03/20/2020 · Online Continual Learning on Sequences
Online continual learning (OCL) refers to the ability of a system to lea...
