Hard Negative Sampling Strategies for Contrastive Representation Learning

06/02/2022
by Afrina Tabassum, et al.

One of the challenges in contrastive learning is selecting appropriate hard negative examples in the absence of label information. Random sampling and importance sampling methods based on feature similarity often lead to sub-optimal performance. In this work, we introduce UnReMix, a hard negative sampling strategy that takes into account anchor similarity, model uncertainty, and representativeness. Experimental results on several benchmarks show that UnReMix improves negative sample selection, and subsequently downstream performance, when compared to state-of-the-art contrastive learning methods.
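
The abstract describes UnReMix as combining three signals when scoring candidate negatives: anchor similarity, model uncertainty, and representativeness. The following is a minimal PyTorch sketch of that idea, not the paper's actual formulation: the uncertainty and representativeness terms here are stand-in proxies (variance and mean of pairwise similarities among candidates), and the function name, weights, and top-k selection are assumptions made for illustration.

import torch
import torch.nn.functional as F

def score_negatives(anchor, negatives, alpha=1.0, beta=1.0, gamma=1.0):
    # Illustrative scoring only; the paper defines its own terms.
    # anchor: (d,) embedding; negatives: (n, d) candidate embeddings.
    a = F.normalize(anchor, dim=0)
    n = F.normalize(negatives, dim=1)
    anchor_sim = n @ a                          # hardness: similarity to the anchor
    pairwise = n @ n.T                          # similarities among the candidates
    representativeness = pairwise.mean(dim=1)   # dense points stand in for the batch
    uncertainty = pairwise.var(dim=1)           # stand-in proxy for model uncertainty
    return alpha * anchor_sim + beta * uncertainty + gamma * representativeness

# Usage: keep the top-k highest-scoring candidates as hard negatives.
anchor = torch.randn(128)
candidates = torch.randn(256, 128)
scores = score_negatives(anchor, candidates)
hard_negatives = candidates[scores.topk(k=32).indices]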

Related research:

Contrastive Learning with Hard Negative Samples (10/09/2020)
We consider the question: how can you sample good negative examples for ...

Hard Negative Sampling via Regularized Optimal Transport for Contrastive Representation Learning (11/04/2021)
We study the problem of designing hard negative sampling distributions f...

ConsPrompt: Easily Exploiting Contrastive Samples for Few-shot Prompt Learning (11/08/2022)
Prompt learning has recently become an effective linguistic tool to motivate...

Improving Word Representations: A Sub-sampled Unigram Distribution for Negative Sampling (10/21/2019)
Word2Vec is the most popular model for word representation and has been ...

Sample4Geo: Hard Negative Sampling For Cross-View Geo-Localisation (03/21/2023)
Cross-View Geo-Localisation is still a challenging task where additional...

SEGA: Structural Entropy Guided Anchor View for Graph Contrastive Learning (05/08/2023)
In contrastive learning, the choice of “view” controls the information t...

Debiased Contrastive Learning (07/01/2020)
A prominent technique for self-supervised representation learning has be...
