Rethinking Prototypical Contrastive Learning through Alignment, Uniformity and Correlation

10/18/2022
by   Shentong Mo, et al.

Contrastive self-supervised learning (CSL) with prototypical regularization has been introduced to learn meaningful representations for downstream tasks that require strong semantic information. However, optimizing CSL with a loss that performs prototypical regularization aggressively, e.g., the ProtoNCE loss, can cause the "coagulation" of examples in the embedding space: the intra-prototype diversity of samples collapses to trivial solutions as each prototype is pushed to be well separated from the others. Motivated by prior work, we propose to mitigate this phenomenon by learning Prototypical representations through Alignment, Uniformity and Correlation (PAUC). Specifically, the ordinary ProtoNCE loss is revised with: (1) an alignment loss that pulls embeddings toward their positive prototypes; (2) a uniformity loss that distributes prototype-level features uniformly; and (3) a correlation loss that increases the diversity and discriminability among prototype-level features. We conduct extensive experiments on various benchmarks, and the results demonstrate the effectiveness of our method in improving the quality of prototypical contrastive representations. In particular, on classification downstream tasks with linear probes, our method outperforms state-of-the-art instance-wise and prototypical contrastive learning methods on the ImageNet-100 dataset by 2.96% under the same settings of batch size and epochs.

