Supervised Contrastive Learning with Hard Negative Samples

08/31/2022
by Ruijie Jiang, et al.

Unsupervised contrastive learning (UCL) is a self-supervised learning technique that aims to learn a useful representation function by pulling positive samples close together while pushing negative samples far apart in the embedding space. To improve the performance of UCL, several works have introduced hard-negative unsupervised contrastive learning (H-UCL), which selects "hard" negative samples instead of the random sampling strategy used in UCL. In a separate line of work, under the assumption that label information is available, supervised contrastive learning (SCL) has recently been developed by extending UCL to the fully supervised setting. In this paper, motivated by the effectiveness of hard-negative sampling strategies in H-UCL and the usefulness of label information in SCL, we propose a contrastive learning framework called hard-negative supervised contrastive learning (H-SCL). Our numerical results demonstrate the effectiveness of H-SCL over both SCL and H-UCL on several image datasets. In addition, we theoretically prove that, under certain conditions, the objective function of H-SCL can be bounded by the objective function of H-UCL, but not by the objective function of UCL. Thus, minimizing the H-UCL loss can act as a proxy for minimizing the H-SCL loss, whereas minimizing the UCL loss cannot. Since H-SCL numerically outperforms the other contrastive learning methods, our theoretical result (bounding the H-SCL loss by the H-UCL loss) helps explain why H-UCL outperforms UCL in practice.
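As a rough illustration of the idea (not the authors' implementation), the sketch below combines a supervised contrastive loss, where positives are same-label samples, with an exponential hard-negative reweighting in the spirit of H-UCL: negatives most similar to the anchor are up-weighted in the denominator. The specific weighting form, the normalization to mean 1, and the hyperparameter values are assumptions for illustration.

```python
import numpy as np

def h_scl_loss(z, labels, tau=0.5, beta=1.0):
    """Hard-negative supervised contrastive (H-SCL) loss, illustrative sketch.

    z      : (n, d) array of L2-normalized embeddings
    labels : (n,)  integer class labels (each anchor needs >= 1 positive)
    tau    : temperature
    beta   : hardness concentration; beta = 0 recovers plain SCL
    """
    n = z.shape[0]
    sim = (z @ z.T) / tau                 # scaled pairwise similarities
    np.fill_diagonal(sim, -np.inf)        # exclude self-pairs (exp -> 0)
    exp_sim = np.exp(sim)

    same = labels[:, None] == labels[None, :]
    np.fill_diagonal(same, False)         # an anchor is not its own positive

    total = 0.0
    for i in range(n):
        pos = np.where(same[i])[0]        # positives: same label as anchor i
        neg = np.where(~same[i])[0]
        neg = neg[neg != i]               # negatives: different label, not self
        # Hardness weights: up-weight negatives most similar to the anchor,
        # normalized to mean 1 so beta = 0 reduces to uniform weighting (SCL).
        w = np.exp(beta * sim[i, neg])
        w = w / w.mean()
        hard_neg = np.sum(w * exp_sim[i, neg])
        # Average the per-positive log-losses for this anchor.
        total += -np.mean(np.log(exp_sim[i, pos] / (exp_sim[i, pos] + hard_neg)))
    return total / n
```

Because the weights are normalized to mean 1 and grow with similarity, setting `beta > 0` can only increase the weighted negative mass relative to `beta = 0`, which matches the intuition that hard-negative sampling makes the objective strictly harder to minimize.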

