Similarity Contrastive Estimation for Self-Supervised Soft Contrastive Learning

11/29/2021
by Julien Denize, et al.

Contrastive representation learning has proven to be an effective self-supervised learning method. Most successful approaches are based on the Noise Contrastive Estimation (NCE) paradigm and consider different views of an instance as positives and other instances as noise that positives should be contrasted with. However, all instances in a dataset are drawn from the same distribution and share underlying semantic information that should not be treated as noise. We argue that a good data representation captures the relations, or semantic similarity, between instances. Contrastive learning implicitly learns such relations but treats the negatives as noise, which harms the quality of the learned relations and therefore of the representation. To circumvent this issue, we propose Similarity Contrastive Estimation (SCE), a novel formulation of contrastive learning based on the semantic similarity between instances. Our training objective can be viewed as soft contrastive learning: instead of hard-classifying instances as positives and negatives, we use a continuous distribution to push or pull instances according to their semantic similarities. The target similarity distribution is computed from weakly augmented instances and sharpened to eliminate irrelevant relations. Each weakly augmented instance is paired with a strongly augmented instance that contrasts its positive while matching the target similarity distribution. Experimental results show that SCE outperforms its baselines MoCov2 and ReSSL on various datasets and is competitive with state-of-the-art algorithms under the ImageNet linear evaluation protocol.
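Reading the abstract, the objective can be sketched as a cross-entropy between a soft target distribution (a one-hot positive mixed with a sharpened similarity distribution computed from the weakly augmented view) and the online distribution computed from the strongly augmented view. Below is a minimal PyTorch sketch under that reading; the function name sce_loss, the queue of other instances, and the hyperparameters lam, tau, and tau_t are illustrative assumptions, not values given by the abstract, and the paper should be consulted for the exact formulation.

```python
import torch
import torch.nn.functional as F


def sce_loss(q, k_weak, queue, lam=0.5, tau=0.1, tau_t=0.05):
    """Sketch of a soft contrastive (SCE-style) objective.

    q:      (B, D) online embeddings of strongly augmented views
    k_weak: (B, D) target embeddings of weakly augmented views
    queue:  (K, D) embeddings of other instances (the "negatives")
    lam:    assumed mixing weight between the one-hot positive target
            and the relational similarity target
    tau:    online temperature; tau_t: sharper target temperature
            (tau_t < tau sharpens the target to suppress irrelevant
            relations, as described in the abstract)
    """
    q = F.normalize(q, dim=1)
    k = F.normalize(k_weak, dim=1).detach()
    queue = F.normalize(queue, dim=1).detach()
    B, K = q.size(0), queue.size(0)

    # Target similarity distribution from the weakly augmented view,
    # computed against the queue and sharpened with tau_t. Whether the
    # positive is included here is an implementation detail the
    # abstract does not specify; this sketch excludes it.
    p_rel = F.softmax(k @ queue.T / tau_t, dim=1)  # (B, K)

    # Soft target: lam on the positive, (1 - lam) spread over the
    # semantic relations to other instances.
    targets = torch.zeros(B, 1 + K, device=q.device)
    targets[:, 0] = lam
    targets[:, 1:] = (1.0 - lam) * p_rel

    # Online distribution: positive logit plus queue logits at tau.
    l_pos = (q * k).sum(dim=1, keepdim=True)       # (B, 1)
    l_neg = q @ queue.T                            # (B, K)
    log_p = F.log_softmax(torch.cat([l_pos, l_neg], dim=1) / tau, dim=1)

    # Cross-entropy between the soft target and the online distribution.
    return -(targets * log_p).sum(dim=1).mean()
```

With lam = 1 this reduces to a hard InfoNCE-style loss, which is one way to see why the objective is described as soft contrastive learning.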


Related research

12/21/2022 · Similarity Contrastive Estimation for Image and Video Soft Contrastive Self-Supervised Learning
Contrastive representation learning has proven to be an effective self-s...

11/03/2022 · Self-Adapting Noise-Contrastive Estimation for Energy-Based Models
Training energy-based models (EBMs) with noise-contrastive estimation (N...

08/06/2022 · Contrastive Positive Mining for Unsupervised 3D Action Representation Learning
Recent contrastive based 3D action representation learning has made grea...

06/28/2021 · A Theory-Driven Self-Labeling Refinement Method for Contrastive Representation Learning
For an image query, unsupervised contrastive learning labels crops of th...

03/16/2022 · Relational Self-Supervised Learning
Self-supervised Learning (SSL) including the mainstream contrastive lear...

03/15/2022 · Improving Event Representation via Simultaneous Weakly Supervised Contrastive Learning and Clustering
Representations of events described in text are important for various ta...

03/22/2023 · MaskCon: Masked Contrastive Learning for Coarse-Labelled Dataset
Deep learning has achieved great success in recent years with the aid of...
