Continual Contrastive Self-supervised Learning for Image Classification

07/05/2021
by Zhiwei Lin, et al.

For artificial learning systems, continual learning over time from a stream of data is essential. Studies of supervised continual learning have made great progress, while catastrophic forgetting in unsupervised learning remains largely unexplored. Among unsupervised methods, self-supervised learning shows tremendous potential for learning visual representations at scale without any labeled data. Improving these representations requires larger and more varied data, and in the real world unlabeled data is generated continuously, which is a natural advantage for self-supervised learning. However, under the current paradigm, packing previous and current data together and retraining on everything wastes time and resources, so a continual self-supervised learning method is badly needed. In this paper, we make a first attempt at continual contrastive self-supervised learning by proposing a rehearsal method that keeps a few exemplars from previous data. Instead of directly combining the saved exemplars with the current dataset for training, we leverage self-supervised knowledge distillation to transfer contrastive information about previous data to the current network by mimicking the similarity-score distribution that the old network infers over the saved exemplars. Moreover, we build an extra sample queue to help the network distinguish between previous and current data and to prevent mutual interference while each learns its own feature representation. Experimental results show that our method performs well on CIFAR100 and ImageNet-Sub: compared with baselines that learn tasks sequentially without any continual learning technique, we improve image classification top-1 accuracy by 1.60% on ImageNet-Sub and 1.29% on CIFAR100.
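To make the distillation idea concrete, the sketch below is a minimal, hypothetical PyTorch rendering of the described objective, not the authors' released code. The function name contrastive_distillation_loss, the temperature value, and the tensor shapes are assumptions; the sketch only shows how a frozen old encoder's similarity-score distribution over saved exemplars could serve as a soft target for the current encoder.

    import torch
    import torch.nn.functional as F

    def contrastive_distillation_loss(new_feats, old_feats,
                                      exemplar_new, exemplar_old,
                                      temperature=0.1):
        # Hypothetical sketch of the distillation term described in the abstract.
        # new_feats / old_feats:     (B, D) features of the current batch from the
        #                            current (new) and frozen (old) encoders.
        # exemplar_new / exemplar_old: (K, D) features of the saved exemplars from
        #                            the same two encoders.

        # L2-normalize so dot products become cosine similarities.
        new_feats = F.normalize(new_feats, dim=1)
        old_feats = F.normalize(old_feats, dim=1)
        exemplar_new = F.normalize(exemplar_new, dim=1)
        exemplar_old = F.normalize(exemplar_old, dim=1)

        # Similarity-score distributions over the exemplar set, shape (B, K).
        log_p_new = F.log_softmax(new_feats @ exemplar_new.t() / temperature, dim=1)
        with torch.no_grad():  # the old network is frozen and only provides targets
            p_old = F.softmax(old_feats @ exemplar_old.t() / temperature, dim=1)

        # KL divergence: the new network's distribution should mimic the old one.
        return F.kl_div(log_p_new, p_old, reduction="batchmean")

In a training loop this term would be added to the ordinary contrastive loss on the current data, so the new encoder keeps the relational structure the old encoder learned over the exemplars while still adapting to the new distribution.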

