
Hard Negative Sampling Strategies for Contrastive Representation Learning

06/02/2022
by Afrina Tabassum, et al.

One of the challenges in contrastive learning is selecting appropriate hard negative examples in the absence of label information. Random sampling, or importance sampling based on feature similarity, often leads to sub-optimal performance. In this work, we introduce UnReMix, a hard negative sampling strategy that takes into account anchor similarity, model uncertainty, and representativeness. Experimental results on several benchmarks show that UnReMix improves negative sample selection, and consequently downstream performance, compared to state-of-the-art contrastive learning methods.
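The abstract names three signals for scoring candidate negatives: similarity to the anchor, model uncertainty, and representativeness. A minimal sketch of how such a combined score could rank a pool of negatives is below; the weighting scheme, the gradient-norm proxy for uncertainty, and the centroid-based representativeness term are all illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def combined_negative_scores(anchor, negatives, grad_norms,
                             alpha=1.0, beta=1.0, gamma=1.0):
    """Score candidate negatives by three signals (illustrative sketch only).

    anchor:     (d,) embedding of the anchor sample
    negatives:  (n, d) embeddings of candidate negatives
    grad_norms: (n,) per-sample gradient norms, assumed here as an
                uncertainty proxy (an assumption, not the paper's definition)
    """
    eps = 1e-8
    neg_norms = np.linalg.norm(negatives, axis=1) + eps

    # 1) Anchor similarity: negatives closer to the anchor are "harder".
    sims = negatives @ anchor / (neg_norms * (np.linalg.norm(anchor) + eps))

    # 2) Uncertainty proxy: normalize gradient norms to [0, 1].
    uncertainty = grad_norms / (grad_norms.max() + eps)

    # 3) Representativeness: similarity to the centroid of the negative pool,
    #    favoring typical rather than outlier negatives.
    centroid = negatives.mean(axis=0)
    rep = negatives @ centroid / (neg_norms * (np.linalg.norm(centroid) + eps))

    return alpha * sims + beta * uncertainty + gamma * rep

# Toy usage: rank 8 random candidates and keep the top-4 hardest.
rng = np.random.default_rng(0)
anchor = rng.normal(size=16)
negatives = rng.normal(size=(8, 16))
grad_norms = rng.uniform(size=8)

scores = combined_negative_scores(anchor, negatives, grad_norms)
hard_idx = np.argsort(scores)[::-1][:4]
```

In a contrastive loss such as InfoNCE, the selected `hard_idx` samples would then serve as the negative set for this anchor; how the three terms are actually weighted and estimated is what UnReMix itself specifies.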


Related research

10/09/2020 · Contrastive Learning with Hard Negative Samples
We consider the question: how can you sample good negative examples for ...

11/04/2021 · Hard Negative Sampling via Regularized Optimal Transport for Contrastive Representation Learning
We study the problem of designing hard negative sampling distributions f...

11/08/2022 · ConsPrompt: Easily Exploiting Contrastive Samples for Few-shot Prompt Learning
Prompt learning recently became an effective linguistic tool to motivate...

10/21/2019 · Improving Word Representations: A Sub-sampled Unigram Distribution for Negative Sampling
Word2Vec is the most popular model for word representation and has been ...

11/08/2022 · On Negative Sampling for Contrastive Audio-Text Retrieval
This paper investigates negative sampling for contrastive learning in th...

06/01/2022 · Negative Sampling for Contrastive Representation Learning: A Review
The learn-to-compare paradigm of contrastive representation learning (CR...

06/18/2021 · Investigating the Role of Negatives in Contrastive Representation Learning
Noise contrastive learning is a popular technique for unsupervised repre...