Extending Contrastive Learning to Unsupervised Coreset Selection

03/05/2021
by Jeongwoo Ju, et al.

Self-supervised contrastive learning offers a means of learning informative features from a pool of unlabeled data. In this paper, we investigate another useful application of such learning: selecting a coreset from an entirely unlabeled dataset. Contrastive learning, one of a large number of self-supervised methods, was recently proposed and has consistently delivered the highest performance, which prompted us to choose two of its leading frameworks: the simple framework for contrastive learning of visual representations (SimCLR) and momentum contrast (MoCo). We calculated the cosine similarity for each example in every epoch over the entire duration of contrastive learning, and accumulated these similarity values to obtain a coreset score. Our assumption was that a sample with consistently low similarity is likely to be informative and thus behave as a coreset member. Compared with existing coreset selection methods that require labels, our approach eliminates the cost of human annotation. On various classification datasets (e.g., CIFAR, SVHN, and QMNIST), the unsupervised coreset selection implemented in this study outperformed a randomly chosen subset and was comparable to existing supervised coreset selection.
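Since the scoring procedure above is algorithmic, a minimal sketch may help make it concrete. The sketch below assumes a PyTorch encoder, a stochastic augmentation function, and a data loader that also yields each example's dataset index; all names (`update_coreset_scores`, `select_coreset`, `augment`) are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn.functional as F

def update_coreset_scores(encoder, augment, loader, scores, device="cpu"):
    """Accumulate per-example cosine similarities for one epoch.

    Two augmented views of each example are embedded, and the cosine
    similarity between the views is added to that example's running
    total. Assumes `loader` yields (images, labels, indices).
    """
    encoder.eval()
    with torch.no_grad():
        for images, _, indices in loader:
            z1 = encoder(augment(images).to(device))   # embedding of view 1
            z2 = encoder(augment(images).to(device))   # embedding of view 2
            sim = F.cosine_similarity(z1, z2, dim=1)   # one score per example
            scores[indices] += sim.cpu()               # running accumulation
    return scores

def select_coreset(scores, k):
    """Pick the k examples with the LOWEST accumulated similarity,
    following the paper's assumption that low-similarity samples are
    the most useful coreset candidates."""
    return torch.argsort(scores)[:k]
```

Calling `update_coreset_scores` once per training epoch and then `select_coreset` after the final epoch mirrors the accumulate-then-rank procedure described in the abstract.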

