Probabilistic Contrastive Loss for Self-Supervised Learning

12/02/2021
by Shen Li, et al.

This paper proposes a probabilistic contrastive loss function for self-supervised learning. The well-known contrastive loss is deterministic and involves a temperature hyperparameter that scales the inner product between two normalized feature embeddings. By reinterpreting the temperature hyperparameter as a quantity related to the radius of the hypersphere, we derive a new loss function that involves a confidence measure quantifying uncertainty in a mathematically grounded manner. Several intriguing properties of the proposed loss function are demonstrated empirically and shown to agree with human-like predictions. We believe the present work offers a new perspective on contrastive learning.
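For context, the deterministic contrastive loss the abstract refers to is commonly instantiated as the InfoNCE objective, where a temperature tau scales the cosine similarity between L2-normalized embeddings. The sketch below is a minimal PyTorch implementation of that standard loss, assuming a SimCLR-style batch of positive pairs; the function and variable names are illustrative, and the paper's probabilistic variant (which replaces the fixed temperature with a confidence measure) is not reproduced here.

import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.1):
    """Standard temperature-scaled contrastive (InfoNCE) loss.

    z1, z2: (N, D) embeddings of two augmented views; row i of z1
    and row i of z2 form a positive pair. Illustrative sketch only.
    """
    # L2-normalize so the inner product becomes a cosine similarity
    # (i.e., embeddings live on the unit hypersphere).
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)

    # Pairwise similarities, scaled by the temperature hyperparameter.
    logits = z1 @ z2.t() / temperature  # shape (N, N)

    # The diagonal entry is the positive; all others act as negatives.
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)

# Example usage with random embeddings:
# z1, z2 = torch.randn(256, 128), torch.randn(256, 128)
# loss = info_nce_loss(z1, z2, temperature=0.1)

Note the geometric reading the abstract builds on: since (x / sqrt(tau)) . (y / sqrt(tau)) = (x . y) / tau, dividing the cosine similarity by tau is equivalent to taking the inner product of embeddings rescaled to a hypersphere of radius 1 / sqrt(tau), which is how the temperature can be reinterpreted as a radius-related quantity.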

Related Research

05/18/2023: Tuned Contrastive Learning
In recent times, contrastive learning based loss functions have become i...

11/10/2022: Self-supervised learning of audio representations using angular contrastive loss
In Self-Supervised Learning (SSL), various pretext tasks are designed fo...

10/08/2021: Temperature as Uncertainty in Contrastive Learning
Contrastive learning has demonstrated great capability to learn represen...

05/26/2020: BHN: A Brain-like Heterogeneous Network
The human brain works in an unsupervised way, and more than one brain re...

05/19/2023: Not All Semantics are Created Equal: Contrastive Self-supervised Learning with Automatic Temperature Individualization
In this paper, we aim to optimize a contrastive loss with individualized...

06/08/2023: Sy-CON: Symmetric Contrastive Loss for Continual Self-Supervised Representation Learning
We introduce a novel and general loss function, called Symmetric Contras...

06/29/2021: Self-Contrastive Learning
This paper proposes a novel contrastive learning framework, coined as Se...