Mitigating Forgetting in Online Continual Learning via Contrasting Semantically Distinct Augmentations

11/10/2022
by Sheng-Feng Yu, et al.

Online continual learning (OCL) aims to enable a model to learn from a non-stationary data stream, continuously acquiring new knowledge while retaining what has already been learnt, under constraints of limited system size and computational cost. The main challenge is the "catastrophic forgetting" issue: the inability to remember learnt knowledge well while learning new knowledge. Focusing on the class-incremental OCL scenario, i.e. OCL for classification, recent advances incorporate contrastive learning to learn more generalised feature representations and achieve state-of-the-art performance, but still cannot fully resolve catastrophic forgetting. In this paper, we likewise adopt contrastive learning but further introduce a semantically distinct augmentation technique, which leverages strong augmentation to generate additional data samples; we show that treating these samples as semantically different from their original classes (thus relating them to out-of-distribution samples) in the contrastive learning mechanism helps alleviate forgetting and improves model stability. Moreover, in addition to contrastive learning, the typical classification mechanism and objective (i.e. a softmax classifier with cross-entropy loss) are included in our model design for faster convergence and for utilising the label information, equipped with a sampling strategy that tackles the tendency to favour new classes (i.e. model bias towards recently learnt classes). Extensive experiments on the CIFAR-10, CIFAR-100, and Mini-ImageNet datasets show that our proposed method achieves superior performance against various baselines.
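As a minimal sketch of the semantically distinct augmentation idea (not the authors' released code): strong augmentation, here 90-degree rotations, produces views that are assigned virtual class labels different from their source class, so a supervised contrastive loss contrasts them against the original class instead of pulling them towards it. The names encoder, num_classes, and the temperature tau below are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def rotate_batch(x, k):
        # Rotate an NCHW image batch by k * 90 degrees (the strong augmentation).
        return torch.rot90(x, k, dims=(2, 3))

    def sup_con_loss(features, labels, tau=0.1):
        # Supervised contrastive loss: samples sharing a label are positives.
        features = F.normalize(features, dim=1)
        sim = features @ features.t() / tau
        pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)).float()
        pos_mask.fill_diagonal_(0)                      # exclude self-pairs
        logits_mask = torch.ones_like(pos_mask).fill_diagonal_(0)
        exp_sim = torch.exp(sim) * logits_mask
        log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-12)
        mean_log_prob = (pos_mask * log_prob).sum(dim=1) / pos_mask.sum(dim=1).clamp(min=1)
        return -mean_log_prob.mean()

    def distinct_augmentation_loss(encoder, x, y, num_classes):
        # Rotation k maps class c to the virtual class c + k * num_classes, so
        # strongly augmented views act as out-of-distribution negatives for
        # their source class in the contrastive objective.
        views = [rotate_batch(x, k) for k in range(4)]
        virtual_labels = [y + k * num_classes for k in range(4)]
        z = encoder(torch.cat(views, dim=0))
        return sup_con_loss(z, torch.cat(virtual_labels, dim=0))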
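For the sampling strategy against bias towards recently learnt classes, one common, illustrative realisation is class-balanced sampling from the replay buffer, sketched below under the assumption that the buffer stores (image, label) pairs; this is an assumed strategy for illustration, not necessarily the paper's exact sampler.

    import random
    from collections import defaultdict

    def class_balanced_replay(buffer, batch_size):
        # Group stored examples by class, then draw classes uniformly so the
        # cross-entropy objective is not dominated by the newest classes.
        by_class = defaultdict(list)
        for x, y in buffer:
            by_class[y].append((x, y))
        classes = list(by_class)
        return [random.choice(by_class[random.choice(classes)])
                for _ in range(batch_size)] if classes else []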

