Contrastive Learning for Online Semi-Supervised General Continual Learning

07/12/2022
by Nicolas Michel, et al.

We study Online Continual Learning with missing labels and propose SemiCon, a new contrastive loss designed for partly labeled data. We demonstrate its efficiency by devising a memory-based method trained on an unlabeled data stream, where every sample added to memory is labeled using an oracle. Our approach outperforms existing semi-supervised methods when few labels are available, and obtains similar results to state-of-the-art supervised methods while using only 2.6% of labels on Split-CIFAR100.
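
The exact formulation of SemiCon is given in the paper; the sketch below is only a rough illustration of the general idea, assuming the loss combines a supervised contrastive (SupCon-style) term over the oracle-labeled memory samples with an unsupervised, SimCLR-style term over two augmented views of the unlabeled stream. The function name semicon_loss and the mixing weight alpha are hypothetical, not taken from the paper.

    # Minimal PyTorch sketch (assumptions labeled above); expects
    # L2-normalized embeddings.
    import torch
    import torch.nn.functional as F

    def semicon_loss(z_mem, y_mem, z_stream, temperature=0.07, alpha=0.5):
        # z_mem: (M, d) embeddings of labeled memory samples
        # y_mem: (M,) integer labels assigned by the oracle
        # z_stream: (2N, d) embeddings of two augmented views of N
        # unlabeled stream samples, ordered [view1; view2]

        # Supervised contrastive term on the labeled memory:
        # positives are other memory samples sharing the same label.
        m = z_mem.size(0)
        sim = z_mem @ z_mem.t() / temperature
        logits = sim - sim.max(dim=1, keepdim=True).values.detach()
        eye = torch.eye(m, device=z_mem.device)
        pos_mask = (y_mem[:, None] == y_mem[None, :]).float() * (1 - eye)
        exp_logits = torch.exp(logits) * (1 - eye)      # exclude self-pairs
        log_prob = logits - torch.log(exp_logits.sum(dim=1, keepdim=True))
        pos_count = pos_mask.sum(dim=1).clamp(min=1)    # guard singletons
        sup_loss = (-(pos_mask * log_prob).sum(dim=1) / pos_count).mean()

        # Unsupervised NT-Xent term on the unlabeled stream:
        # each view's positive is its augmented counterpart.
        n = z_stream.size(0) // 2
        sim_u = z_stream @ z_stream.t() / temperature
        sim_u.fill_diagonal_(float('-inf'))             # drop self-similarity
        targets = torch.cat([torch.arange(n, 2 * n),
                             torch.arange(n)]).to(z_stream.device)
        unsup_loss = F.cross_entropy(sim_u, targets)

        # Convex combination balancing labeled vs. unlabeled signal.
        return alpha * sup_loss + (1 - alpha) * unsup_loss

    # Toy check with random, L2-normalized embeddings.
    z_mem = F.normalize(torch.randn(8, 128), dim=1)
    y_mem = torch.randint(0, 4, (8,))
    z_stream = F.normalize(torch.randn(16, 128), dim=1)  # two views of 8 samples
    print(semicon_loss(z_mem, y_mem, z_stream))

The two terms mirror the semi-supervised setting described in the abstract: the memory provides a small labeled set for class-level attraction, while the stream contributes instance-level contrastive signal without labels.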


research · 01/23/2022
Learning to Predict Gradients for Semi-Supervised Continual Learning
A key challenge for machine intelligence is to learn new visual concepts...

research · 04/27/2022
Executive Function: A Contrastive Value Policy for Resampling and Relabeling Perceptions via Hindsight Summarization?
We develop the few-shot continual learning task from first principles an...

research · 03/02/2023
Ego-Vehicle Action Recognition based on Semi-Supervised Contrastive Learning
In recent years, many automobiles have been equipped with cameras, which...

research · 01/02/2021
ORDisCo: Effective and Efficient Usage of Incremental Unlabeled Data for Semi-supervised Continual Learning
Continual learning usually assumes the incoming data are fully labeled, ...

research · 01/11/2023
A Distinct Unsupervised Reference Model From The Environment Helps Continual Learning
The existing continual learning methods are mainly focused on fully-supe...

research · 12/09/2022
A soft nearest-neighbor framework for continual semi-supervised learning
Despite significant advances, the performance of state-of-the-art contin...

research · 09/12/2023
Plasticity-Optimized Complementary Networks for Unsupervised Continual Learning
Continuous unsupervised representation learning (CURL) research has grea...
