Towards Robust Feature Learning with t-vFM Similarity for Continual Learning

06/04/2023
by   Bilan Gao, et al.

Continual learning has been developed using the standard supervised contrastive loss from the perspective of feature learning. Due to data imbalance during training, however, it remains challenging to learn good representations. In this work, we suggest replacing the cosine similarity in the supervised contrastive loss with the t-vFM similarity in order to learn more robust representations. We validate our method on the image classification dataset Seq-CIFAR-10, and the results outperform recent continual learning baselines.
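To make the idea concrete, the sketch below is a minimal, hedged illustration (not the paper's exact implementation): it swaps the cosine similarity inside a supervised contrastive loss for a t-vMF-style similarity, phi(cos; kappa) = (1 + cos) / (1 + kappa * (1 - cos)) - 1, which the "t-vFM" name appears to reference. The concentration kappa and the temperature are placeholder values, and the loss follows the standard supervised contrastive formulation rather than any specifics from this paper.

```python
# Hedged sketch: supervised contrastive loss with a t-vMF-style similarity
# in place of plain cosine similarity. Hyperparameters (kappa, temperature)
# are illustrative placeholders, not values taken from the paper.
import torch
import torch.nn.functional as F


def t_vmf_similarity(features: torch.Tensor, kappa: float = 16.0) -> torch.Tensor:
    """Pairwise similarity phi(cos; kappa) = (1 + cos) / (1 + kappa * (1 - cos)) - 1."""
    z = F.normalize(features, dim=1)   # project embeddings onto the unit hypersphere
    cos = z @ z.t()                    # pairwise cosine similarities
    return (1.0 + cos) / (1.0 + kappa * (1.0 - cos)) - 1.0


def supcon_tvmf_loss(features: torch.Tensor,
                     labels: torch.Tensor,
                     kappa: float = 16.0,
                     temperature: float = 0.1) -> torch.Tensor:
    """Supervised contrastive loss with t-vMF similarity used as the logits."""
    n = features.size(0)
    sim = t_vmf_similarity(features, kappa) / temperature

    # Mask out self-comparisons and build the positive-pair mask from labels.
    self_mask = torch.eye(n, dtype=torch.bool, device=features.device)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask

    # Log-softmax over all non-self pairs for each anchor.
    sim = sim.masked_fill(self_mask, float("-inf"))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Average log-probability over positives; anchors without positives are skipped.
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    loss_per_anchor = -log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts
    return loss_per_anchor[pos_mask.any(dim=1)].mean()


# Toy usage with embeddings standing in for encoder features on a Seq-CIFAR-10 batch.
feats = torch.randn(8, 128)
labels = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])
print(supcon_tvmf_loss(feats, labels))
```

With kappa = 0 the similarity reduces to plain cosine, so kappa controls how sharply the loss concentrates positive pairs on the hypersphere.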


