Tuned Contrastive Learning

05/18/2023
by Chaitanya Animesh, et al.

In recent times, contrastive-learning-based loss functions have become increasingly popular for visual self-supervised representation learning owing to their state-of-the-art (SOTA) performance. Most modern contrastive learning loss functions, such as the one used in SimCLR, are InfoNCE-based and generalize only to one positive and multiple negatives per anchor. A recent state-of-the-art supervised contrastive (SupCon) loss extends self-supervised contrastive learning to the supervised setting by generalizing to multiple positives and multiple negatives in a batch, and improves upon the cross-entropy loss. In this paper, we propose a novel contrastive loss function, the Tuned Contrastive Learning (TCL) loss, which generalizes to multiple positives and multiple negatives within a batch and offers parameters to tune and improve the gradient responses from hard positives and hard negatives. We provide a theoretical analysis of our loss function's gradient response and show mathematically how it improves on that of the SupCon loss. Empirically, we compare our loss function with the SupCon loss and cross-entropy loss in a supervised setting on multiple classification datasets. We also show the stability of our loss function under various hyper-parameter settings. Finally, we compare TCL with various SOTA self-supervised learning methods and show that our loss function achieves performance on par with SOTA methods in both supervised and self-supervised settings.
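The abstract positions TCL as a generalization of the SupCon loss to which tunable gradient-response parameters are added; the exact TCL parameterization is not given here, but the SupCon baseline it extends can be sketched. Below is a minimal NumPy version, assuming L2-normalizable embeddings and integer class labels; the function name and the `temperature` default are illustrative, not taken from the paper.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """SupCon-style contrastive loss: each anchor is pulled toward all
    same-class samples (multiple positives) and pushed from all
    other-class samples (multiple negatives) in the batch.

    features: (N, D) array of embeddings; labels: (N,) integer class ids.
    """
    # L2-normalize so dot products are cosine similarities
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / temperature
    n = sim.shape[0]
    self_mask = np.eye(n, dtype=bool)
    # exclude each anchor from its own softmax denominator
    sim = np.where(self_mask, -np.inf, sim)
    # numerically stable row-wise log-softmax
    row_max = sim.max(axis=1, keepdims=True)
    log_prob = sim - row_max - np.log(
        np.exp(sim - row_max).sum(axis=1, keepdims=True))
    # positives: same label, but never the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    # mean log-probability over each anchor's positives, negated
    pos_log_prob = np.where(pos_mask, log_prob, 0.0).sum(axis=1)
    per_anchor = -pos_log_prob / np.maximum(pos_mask.sum(axis=1), 1)
    return per_anchor.mean()
```

Embeddings whose same-class pairs are tightly aligned should yield a much smaller loss than embeddings whose positives point apart, which is the property both SupCon and (per the abstract) TCL optimize.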

