Gradient-matching coresets for continual learning

12/09/2021
by Lukas Balles, et al.

We devise a coreset selection method based on the idea of gradient matching: the gradients induced by the coreset should match, as closely as possible, those induced by the original training dataset. We evaluate the method in the context of continual learning, where it can be used to curate a rehearsal memory. Our method outperforms strong competitors such as reservoir sampling across a range of memory sizes.
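To make the gradient-matching idea concrete, the sketch below shows one simple way such a selection could be implemented in PyTorch: compute per-example gradients, then greedily pick the examples whose average gradient is closest to the full dataset's average gradient. This is an illustrative assumption, not the authors' exact algorithm; the function names (per_example_gradients, select_coreset) are hypothetical.

```python
# Minimal sketch of gradient-matching coreset selection (illustrative only).
import torch


def per_example_gradients(model, loss_fn, xs, ys):
    """Return a (n_examples, n_params) matrix of flattened per-example gradients."""
    params = [p for p in model.parameters() if p.requires_grad]
    rows = []
    for x, y in zip(xs, ys):
        model.zero_grad()
        loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)
        rows.append(torch.cat([g.reshape(-1) for g in grads]))
    return torch.stack(rows)


def select_coreset(grads, k):
    """Greedily choose k indices whose mean gradient best matches the full mean."""
    target = grads.mean(dim=0)                 # gradient of the full training set
    selected, running_sum = [], torch.zeros_like(target)
    remaining = set(range(grads.shape[0]))
    for step in range(k):
        best_idx, best_err = None, float("inf")
        for i in remaining:
            # Error if example i were added to the current selection.
            candidate_mean = (running_sum + grads[i]) / (step + 1)
            err = torch.norm(candidate_mean - target).item()
            if err < best_err:
                best_idx, best_err = i, err
        selected.append(best_idx)
        running_sum += grads[best_idx]
        remaining.remove(best_idx)
    return selected
```

In a continual-learning setting, the returned indices would identify the examples kept in the rehearsal memory; a practical implementation would likely restrict gradients to the last layer or use a weighted selection to keep the computation tractable.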

Related research

03/28/2022 · Gradient-Matching Coresets for Rehearsal-Based Continual Learning
The goal of continual learning (CL) is to efficiently update a machine l...

08/21/2021 · Principal Gradient Direction and Confidence Reservoir Sampling for Continual Learning
Task-free online continual learning aims to alleviate catastrophic forge...

04/10/2022 · Information-theoretic Online Memory Selection for Continual Learning
A challenging problem in task-free continual learning is the online sele...

03/02/2022 · Continual Learning of Multi-modal Dynamics with External Memory
We study the problem of fitting a model to a dynamical environment when ...

03/20/2023 · Sparse Distributed Memory is a Continual Learner
Continual learning is a problem for artificial neural networks that thei...

12/06/2018 · Continual Learning Augmented Investment Decisions
Investment decisions can benefit from incorporating an accumulated knowl...

07/24/2020 · Mind Your Manners! A Dataset and A Continual Learning Approach for Assessing Social Appropriateness of Robot Actions
To date, endowing robots with an ability to assess social appropriatenes...
