Siamese Prototypical Contrastive Learning

08/18/2022
by Shentong Mo, et al.

Contrastive Self-supervised Learning (CSL) is a practical solution for learning meaningful visual representations from massive data in an unsupervised manner. Ordinary CSL embeds the features extracted from neural networks onto specific topological structures. During training, the contrastive loss draws the different views of the same input together while pushing the embeddings of different inputs apart. One drawback of CSL is that the loss term requires a large number of negative samples to yield a tighter mutual information bound. However, increasing the number of negative samples through a larger batch size also amplifies the effect of false negatives: semantically similar samples are pushed apart from the anchor, degrading downstream performance. In this paper, we tackle this problem by introducing a simple but effective contrastive learning framework. The key insight is to employ a siamese-style metric loss that matches intra-prototype features while increasing the distance between inter-prototype features. We conduct extensive experiments on various benchmarks, and the results demonstrate the effectiveness of our method in improving the quality of visual representations. In particular, our unsupervised pre-trained ResNet-50 with a linear probe outperforms the fully supervised version on the ImageNet-1K dataset.
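To make the contrast between the two loss designs concrete, the following PyTorch sketch places a standard InfoNCE contrastive loss (which relies on in-batch negatives, and thus suffers from false negatives as batch size grows) next to a simplified prototype-matching loss in the spirit the abstract describes. This is a minimal illustration, not the paper's actual formulation: the function names, the hard prototype assignment, the stop-gradient placement, and the temperature value are all assumptions.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.1):
    """Standard InfoNCE contrastive loss over a batch.

    z1, z2: (N, D) embeddings of two augmented views of the same N inputs.
    Each positive pair (z1[i], z2[i]) is contrasted against every other
    sample in the batch, which all act as negatives -- including any
    semantically similar samples (the false-negative problem).
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                # (N, N) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)           # diagonal entries are positives

def prototype_matching_loss(z1, z2, prototypes, temperature=0.1):
    """Illustrative siamese-style prototype loss (hypothetical sketch).

    Instead of contrasting against individual in-batch negatives, each view
    is pulled toward the prototype assigned to its sibling view and pushed
    away from the other prototypes, so semantically similar samples sharing
    a prototype are no longer treated as negatives.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    protos = F.normalize(prototypes, dim=1)           # (K, D) prototype vectors
    logits1 = z1 @ protos.t() / temperature           # (N, K) view-1 scores
    logits2 = z2 @ protos.t() / temperature           # (N, K) view-2 scores
    # Each view predicts the hard prototype assignment of the other view;
    # the target side is computed without gradients, in siamese style.
    with torch.no_grad():
        assign1 = logits1.argmax(dim=1)
        assign2 = logits2.argmax(dim=1)
    return 0.5 * (F.cross_entropy(logits1, assign2)
                  + F.cross_entropy(logits2, assign1))
```

In a full training pipeline the prototypes would themselves be learned, e.g. as a trainable embedding matrix or as running cluster centers, and soft assignments are a common alternative to the hard argmax used here for brevity.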

