SPeCiaL: Self-Supervised Pretraining for Continual Learning

06/16/2021
by Lucas Caccia, et al.

This paper presents SPeCiaL: a method for unsupervised pretraining of representations tailored for continual learning. Our approach devises a meta-learning objective that differentiates through a sequential learning process. Specifically, we train a linear model over the representations to match different augmented views of the same image, with each view presented sequentially. The linear model is then evaluated on its ability to classify both the images it just saw and images from previous iterations. This gives rise to representations that favor quick knowledge retention with minimal forgetting. We evaluate SPeCiaL in the Continual Few-Shot Learning setting, and show that it can match or outperform other supervised pretraining approaches.
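To make the abstract concrete, the following is a minimal sketch (in PyTorch, not the authors' released code) of this kind of meta-objective: a linear head is updated sequentially on augmented views of each image, and the loss on fresh views of everything seen so far is backpropagated through those updates into the encoder. The toy encoder, the noise-based augment stand-in, and the values of feat_dim, n_images, and inner_lr are illustrative assumptions rather than the paper's exact setup.

# Sketch of a SPeCiaL-style meta-objective: differentiate through a
# sequential inner loop of a linear head to train the encoder.
import torch
import torch.nn as nn
import torch.nn.functional as F

feat_dim, n_images, inner_lr = 64, 8, 0.1   # assumed toy hyperparameters

encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256),
                        nn.ReLU(), nn.Linear(256, feat_dim))
outer_opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

def augment(x):
    # stand-in augmentation: additive noise (the paper uses image augmentations)
    return x + 0.1 * torch.randn_like(x)

def meta_step(images):
    """images: (n_images, 3, 32, 32); each image defines its own pseudo-class."""
    labels = torch.arange(n_images)
    # Linear head initialised per episode; kept as a plain tensor so the
    # inner gradient updates stay differentiable w.r.t. the encoder.
    W = torch.zeros(n_images, feat_dim, requires_grad=True)

    outer_loss = 0.0
    for t in range(n_images):                       # views arrive sequentially
        view = augment(images[t:t + 1])
        z = encoder(view)
        inner_loss = F.cross_entropy(z @ W.t(), labels[t:t + 1])
        (grad_W,) = torch.autograd.grad(inner_loss, W, create_graph=True)
        W = W - inner_lr * grad_W                   # differentiable inner update

        # Evaluate retention: classify fresh views of everything seen so far,
        # i.e. the image just seen plus images from previous iterations.
        seen = augment(images[: t + 1])
        outer_loss = outer_loss + F.cross_entropy(encoder(seen) @ W.t(),
                                                  labels[: t + 1])

    outer_opt.zero_grad()
    outer_loss.backward()                           # backprop through the sequence
    outer_opt.step()
    return outer_loss.item()

# usage on a random batch of images
print(meta_step(torch.randn(n_images, 3, 32, 32)))

The outer loss rewards a representation on which a simple linear learner both picks up new pseudo-classes quickly and keeps classifying earlier ones correctly, which is the "quick retention with minimal forgetting" property the abstract describes.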

Related research

07/26/2021
Continual-wav2vec2: an Application of Continual Learning for Self-Supervised Automatic Speech Recognition
We present a method for continual learning of speech representations for...

09/13/2023
Domain-Aware Augmentations for Unsupervised Online General Continual Learning
Continual Learning has been challenging, especially when dealing with un...

02/21/2020
Learning to Continually Learn
Continual lifelong learning requires an agent or model to learn many seq...

01/28/2021
Generalising via Meta-Examples for Continual Learning in the Wild
Learning quickly and continually is still an ambitious task for neural n...

06/06/2023
Continual Learning in Linear Classification on Separable Data
We analyze continual learning on a sequence of separable linear classifi...

06/02/2021
Personalizing Pre-trained Models
Self-supervised or weakly supervised models trained on large-scale datas...
