Universum-inspired Supervised Contrastive Learning

04/22/2022
by Aiyang Han, et al.

Mixup is an efficient data augmentation method that generates additional samples through convex combinations of pairs of original data points and their labels. Although its effectiveness is theoretically dependent on data properties, Mixup is reported to perform well as a regularizer and calibrator, contributing reliable robustness and generalization to neural network training. In this paper, inspired by Universum Learning, which uses out-of-class samples to assist the target tasks, we investigate Mixup from a largely under-explored perspective: its potential to generate in-domain samples that belong to none of the target classes, that is, universum data. We find that, in the framework of supervised contrastive learning, universum-style Mixup produces surprisingly high-quality hard negatives, greatly relieving the need for a large batch size in contrastive learning. With these findings, we propose Universum-inspired Contrastive learning (UniCon), which incorporates the Mixup strategy to generate universum data as negatives and pushes them apart from anchor samples of the target classes. Our approach not only improves Mixup with hard labels, but also introduces a novel measure for generating universum data. With a linear classifier on the learned representations, our method achieves 81.68% top-1 accuracy on CIFAR-100, surpassing the state of the art by a significant margin of 5.2% with a much smaller batch size, typically 256 in UniCon vs. 1024 in SupCon using ResNet-50.
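The two ingredients the abstract describes can be sketched in a few lines: standard Mixup takes a convex combination of two samples and their labels, while universum-style Mixup deliberately mixes samples from *different* classes so the result is in-domain yet belongs to no target class, making it usable as a negative in a contrastive loss. This is a minimal numpy sketch under those assumptions; the function names and the fixed mixing ratio `lam=0.5` are illustrative, not the paper's implementation.

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=1.0, rng=None):
    # Standard Mixup: convex combination of two inputs and their
    # one-hot labels, with the ratio drawn from Beta(alpha, alpha).
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1.0 - lam) * x2, lam * y1 + (1.0 - lam) * y2

def universum_negatives(x, labels, lam=0.5, rng=None):
    # Universum-style Mixup (illustrative): mix each sample with a
    # partner drawn from a different class, so the mixture lies in the
    # data domain but belongs to none of the target classes.
    # lam = 0.5 places it midway between the two classes.
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    partners = np.empty(n, dtype=int)
    for i in range(n):
        # choose a partner whose label differs from labels[i]
        candidates = np.flatnonzero(labels != labels[i])
        partners[i] = rng.choice(candidates)
    return lam * x + (1.0 - lam) * x[partners]
```

In a supervised contrastive setup, the output of `universum_negatives` would be encoded alongside the batch and pushed apart from every class anchor; because each mixture sits between classes, it acts as a hard negative without enlarging the batch.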
