Debiased Contrastive Learning

07/01/2020
by Ching-Yao Chuang, et al.

A prominent technique for self-supervised representation learning has been to contrast semantically similar and dissimilar pairs of samples. Without access to labels, dissimilar (negative) points are typically taken to be randomly sampled datapoints, implicitly accepting that these points may in fact have the same label as the anchor. Perhaps unsurprisingly, in a synthetic setting where labels are available, we observe that sampling negative examples from truly different labels improves performance. Motivated by this observation, we develop a debiased contrastive objective that corrects for the sampling of same-label datapoints, even without knowledge of the true labels. Empirically, the proposed objective consistently outperforms the state of the art for representation learning on vision, language, and reinforcement learning benchmarks. Theoretically, we establish generalization bounds for the downstream classification task.
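To make the idea concrete, below is a minimal NumPy sketch of a debiased correction of this kind for a single anchor. The function name, the temperature `t`, and the class prior `tau_plus` (the assumed probability that a randomly drawn "negative" actually shares the anchor's label) are illustrative choices, not code from the paper: the negative term is re-estimated by subtracting the expected contribution of false negatives and clamping the result so it stays positive.

```python
import numpy as np

def debiased_contrastive_loss(pos_sim, neg_sims, tau_plus=0.1, t=0.5):
    """Debiased contrastive loss for one anchor (illustrative sketch).

    pos_sim  : similarity between the anchor and its positive view.
    neg_sims : similarities to N randomly drawn "negatives", of which
               a fraction tau_plus are expected to be false negatives.
    tau_plus : assumed class prior; t : softmax temperature.
    """
    neg_sims = np.asarray(neg_sims, dtype=float)
    n = len(neg_sims)
    pos_exp = np.exp(pos_sim / t)
    neg_exp_mean = np.exp(neg_sims / t).mean()
    # Debias: remove the expected false-negative contribution, then
    # clamp from below so the estimated negative term stays positive.
    ng = np.maximum(
        (neg_exp_mean - tau_plus * pos_exp) / (1.0 - tau_plus),
        np.exp(-1.0 / t),
    )
    # Standard InfoNCE form, with the debiased negative estimate.
    return -np.log(pos_exp / (pos_exp + n * ng))
```

With `tau_plus=0` the correction vanishes and the sketch reduces to the ordinary contrastive (InfoNCE-style) loss; a positive `tau_plus` shrinks the negative term, reflecting that some sampled negatives were likely positives in disguise.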


