Learning the Relation between Similarity Loss and Clustering Loss in Self-Supervised Learning

01/08/2023
by   Jidong Ge, et al.

Self-supervised learning enables networks to learn discriminative features from massive unlabeled data. Most state-of-the-art methods maximize the similarity between two augmentations of one image based on contrastive learning. By exploiting the consistency between two augmentations, the burden of manual annotation is removed. Contrastive learning exploits instance-level information to learn robust features. However, the learned information is probably confined to different views of the same instance. In this paper, we attempt to leverage the similarity between two distinct images to boost representation learning in self-supervised learning. In contrast to instance-level information, the similarity between two distinct images may provide more useful information. Besides, we analyze the relation between similarity loss and feature-level cross-entropy loss. These two losses are essential to most deep learning methods, yet the relation between them remains unclear. Similarity loss helps obtain instance-level representations, while feature-level cross-entropy loss helps mine the similarity between distinct images. We provide theoretical analyses and experiments showing that a suitable combination of these two losses achieves state-of-the-art results.
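The abstract contrasts two loss terms: an instance-level similarity loss between two augmentations of the same image, and a feature-level cross-entropy loss that can surface similarity across distinct images. The sketch below illustrates one common way these terms are formulated (negative cosine similarity, and cross-entropy over per-feature soft assignments across the batch). The function names, the feature-level formulation, and the weighting `lam` are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def cosine_similarity_loss(z1, z2):
    """Instance-level similarity loss: negative mean cosine similarity
    between embeddings of two augmentations of the same images.
    z1, z2: arrays of shape (batch, features)."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    return -np.mean(np.sum(z1 * z2, axis=1))

def feature_level_cross_entropy(z1, z2, eps=1e-8):
    """Feature-level cross-entropy (a hypothetical formulation): treat
    each feature dimension as a soft assignment over the batch and
    penalize disagreement between the two views' distributions."""
    # Softmax over the batch dimension, separately for each feature.
    p1 = np.exp(z1) / np.sum(np.exp(z1), axis=0, keepdims=True)
    p2 = np.exp(z2) / np.sum(np.exp(z2), axis=0, keepdims=True)
    # Cross-entropy of view 2's distribution under view 1's, averaged
    # over feature dimensions.
    return -np.mean(np.sum(p1 * np.log(p2 + eps), axis=0))

def combined_loss(z1, z2, lam=0.5):
    """Weighted combination of the two terms; lam is an assumed
    trade-off hyperparameter."""
    return cosine_similarity_loss(z1, z2) + lam * feature_level_cross_entropy(z1, z2)
```

With identical embeddings for the two views, the similarity term reaches its minimum of -1, while the feature-level term reduces to the (positive) entropy of the per-feature batch distributions; tuning `lam` balances instance-level invariance against cross-image structure.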


