Ranking Info Noise Contrastive Estimation: Boosting Contrastive Learning via Ranked Positives

01/27/2022
by   David T. Hoffmann, et al.

This paper introduces Ranking Info Noise Contrastive Estimation (RINCE), a new member of the InfoNCE family of losses that preserves a ranked ordering of positive samples. In contrast to the standard InfoNCE loss, which requires a strict binary separation of the training pairs into similar and dissimilar samples, RINCE can exploit information about a similarity ranking to learn a corresponding embedding space. We show that the proposed loss function learns favorable embeddings compared to the standard InfoNCE whenever at least noisy ranking information is available or the definition of positives and negatives is blurry. We demonstrate this on a supervised classification task with additional superclass labels and noisy similarity scores. Furthermore, we show that RINCE can also be applied to unsupervised training, with experiments on unsupervised representation learning from videos. In particular, the resulting embedding yields higher classification accuracy and retrieval rates than the standard InfoNCE loss, and performs better in out-of-distribution detection.
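To make the mechanism concrete, the sketch below shows one way a ranked InfoNCE-style loss in the spirit of RINCE could be implemented in PyTorch. It is an illustration under stated assumptions, not the paper's reference implementation: the function name ranked_info_nce, the choice of one temperature per rank level, and the scheme of treating every lower-ranked positive as a negative for the current rank are filled in for the sake of the example.

```python
import torch
import torch.nn.functional as F

def ranked_info_nce(anchor, positives, negatives, temperatures=None):
    """Illustrative ranked InfoNCE-style loss (sketch, not the paper's code).

    anchor:     (d,) embedding of the query sample.
    positives:  list of (n_r, d) tensors, ordered from most similar
                (rank 1) to least similar rank.
    negatives:  (n_neg, d) tensor of negative embeddings.

    For each rank r, samples of rank <= r act as positives, while all
    lower-ranked positives plus the true negatives act as negatives;
    the per-rank InfoNCE terms are summed.
    """
    if temperatures is None:
        temperatures = [0.1] * len(positives)  # assumed default

    # Cosine similarities via L2-normalized dot products.
    anchor = F.normalize(anchor, dim=-1)
    positives = [F.normalize(p, dim=-1) for p in positives]
    negatives = F.normalize(negatives, dim=-1)

    loss = anchor.new_zeros(())
    for r, tau in enumerate(temperatures):
        # Positives for this rank level: everything of rank <= r.
        pos = torch.cat(positives[: r + 1], dim=0)
        # Negatives: positives of rank > r plus the true negatives.
        lower = positives[r + 1:]
        neg = torch.cat(lower + [negatives], dim=0) if lower else negatives

        pos_sim = (pos @ anchor) / tau  # (n_pos,)
        neg_sim = (neg @ anchor) / tau  # (n_neg,)

        # Multi-positive InfoNCE: -log p(positive), averaged over positives.
        log_denom = torch.logsumexp(torch.cat([pos_sim, neg_sim]), dim=0)
        loss = loss + (log_denom - pos_sim).mean()
    return loss

# Hypothetical usage with two rank levels, e.g. same-class and
# same-superclass positives (all variable names are illustrative):
# loss = ranked_info_nce(
#     anchor=z_query,                          # (d,)
#     positives=[z_same_class, z_superclass],  # most to least similar
#     negatives=z_other,                       # (n_neg, d)
# )
```

With a single rank level the sum collapses to one term and the expression reduces to a multi-positive InfoNCE loss, which matches the abstract's framing of RINCE as a generalization of the standard InfoNCE objective.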


Related research

04/06/2021
Scene Graph Embeddings Using Relative Similarity Supervision
Scene graphs are a powerful structured representation of the underlying ...

10/12/2021
Label-Aware Ranked Loss for robust People Counting using Automotive in-cabin Radar
In this paper, we introduce the Label-Aware Ranked loss, a novel metric ...

06/03/2022
Contrastive learning unifies t-SNE and UMAP
Neighbor embedding methods t-SNE and UMAP are the de facto standard for ...

03/08/2022
Selective-Supervised Contrastive Learning with Noisy Labels
Deep networks have strong capacities of embedding data into latent repre...

01/12/2022
Robust Contrastive Learning against Noisy Views
Contrastive learning relies on an assumption that positive pairs contain...

09/12/2022
Hard Negatives or False Negatives: Correcting Pooling Bias in Training Neural Ranking Models
Neural ranking models (NRMs) have become one of the most important techn...

12/21/2021
Max-Margin Contrastive Learning
Standard contrastive learning approaches usually require a large number ...
