A Generalized Supervised Contrastive Learning Framework

06/01/2022
by Jaewon Kim, et al.

Building on the remarkable recent achievements of contrastive learning in self-supervised representation learning, supervised contrastive learning (SupCon) has successfully extended batch contrastive approaches to the supervised setting and outperformed cross-entropy on various datasets with ResNet architectures. In this work, we present GenSCL: a generalized supervised contrastive learning framework that seamlessly adapts modern image-based regularizations (such as Mixup and CutMix) and knowledge distillation (KD) to SupCon through our generalized supervised contrastive loss. The generalized supervised contrastive loss extends the supervised contrastive loss by measuring the cross-entropy between the similarity of labels and the similarity of latent features, so a model can learn to what extent each contrastive sample should be pulled toward an anchor in the latent space. By explicitly and fully leveraging label information, GenSCL breaks the boundary between conventional positives and negatives, and any kind of pre-trained teacher classifier can be utilized. ResNet-50 trained with GenSCL using Mixup-CutMix and KD achieves a state-of-the-art accuracy of 97.6%, significantly improving on the results reported in the original SupCon (by 1.6% and 8.2%). Code is available at https://t.ly/yuUO.
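To make the idea of "cross-entropy between label similarity and feature similarity" concrete, below is a minimal, hypothetical sketch of such a loss in PyTorch. It is not the authors' exact formulation: the function name, the temperature value, and the construction of the label-similarity matrix (hard one-hot agreement, Mixup/CutMix mixing ratios, or a teacher's soft predictions for KD) are all illustrative assumptions.

```python
# Hypothetical sketch of a generalized supervised contrastive loss:
# each anchor's target is a soft distribution over the batch derived from
# label similarity, and the loss is the cross-entropy between that target
# and the softmax over feature similarities. Illustrative only.
import torch
import torch.nn.functional as F

def generalized_supcon_loss(features, label_sim, temperature=0.1):
    """features:  (N, D) L2-normalized embeddings of the batch.
    label_sim: (N, N) non-negative label-similarity matrix, e.g. 1 for
    same class / 0 otherwise, soft values from Mixup/CutMix mixing
    ratios, or similarities of a teacher's predictions (KD)."""
    n = features.size(0)
    # Pairwise feature similarities (cosine, since features are normalized).
    logits = features @ features.t() / temperature
    # Exclude self-similarity from both the targets and the softmax.
    self_mask = torch.eye(n, dtype=torch.bool, device=features.device)
    logits = logits.masked_fill(self_mask, float('-inf'))
    targets = label_sim.masked_fill(self_mask, 0.0)
    # Normalize label similarities into a per-anchor target distribution.
    targets = targets / targets.sum(dim=1, keepdim=True).clamp(min=1e-12)
    # Cross-entropy between label-similarity and feature-similarity distributions.
    log_p = F.log_softmax(logits, dim=1)
    return -(targets * log_p).sum(dim=1).mean()
```

Under this reading, a hard 0/1 `label_sim` recovers a SupCon-style objective, while soft similarities (from mixed images or a teacher classifier) tell the model how strongly each sample should be pulled toward the anchor, which is how the boundary between fixed positives and negatives dissolves.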

research
05/18/2023

Tuned Contrastive Learning

In recent times, contrastive learning based loss functions have become i...
research
10/11/2021

SCEHR: Supervised Contrastive Learning for Clinical Risk Prediction using Electronic Health Records

Contrastive learning has demonstrated promising performance in image and...
research
10/19/2021

Momentum Contrastive Autoencoder: Using Contrastive Learning for Latent Space Distribution Matching in WAE

Wasserstein autoencoder (WAE) shows that matching two distributions is e...
research
05/07/2021

ConCAD: Contrastive Learning-based Cross Attention for Sleep Apnea Detection

With recent advancements in deep learning methods, automatically learnin...
research
04/22/2022

Universum-inspired Supervised Contrastive Learning

Mixup is an efficient data augmentation method which generates additiona...
research
06/29/2021

Self-Contrastive Learning

This paper proposes a novel contrastive learning framework, coined as Se...
research
07/11/2022

Brain-Aware Replacements for Supervised Contrastive Learning

We propose a novel framework for Alzheimer's disease (AD) detection usin...
