Understanding Contrastive Representation Learning through Alignment and Uniformity on the Hypersphere

05/20/2020
by Tongzhou Wang, et al.

Contrastive representation learning has been outstandingly successful in practice. In this work, we identify two key properties related to the contrastive loss: (1) alignment (closeness) of features from positive pairs, and (2) uniformity of the induced distribution of the (normalized) features on the hypersphere. We prove that, asymptotically, the contrastive loss optimizes these properties, and analyze their positive effects on downstream tasks. Empirically, we introduce an optimizable metric to quantify each property. Extensive experiments on standard vision and language datasets confirm the strong agreement between both metrics and downstream task performance. Remarkably, directly optimizing for these two metrics leads to representations with comparable or better performance at downstream tasks than contrastive learning.

Project Page: https://ssnl.github.io/hypersphere
Code: https://github.com/SsnL/align_uniform
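Concretely, the two metrics admit short PyTorch implementations. The sketch below follows the alignment and uniformity losses as the abstract describes them (expected positive-pair distance, and the log of the mean pairwise Gaussian potential on the hypersphere); the defaults alpha=2 and t=2 are the settings recommended in the paper, while the batch shapes and random inputs here are illustrative assumptions, not part of the original.

```python
import torch
import torch.nn.functional as F

def align_loss(x, y, alpha=2):
    # Alignment: average distance between features of positive pairs.
    # x, y: (N, D) batches of L2-normalized features; (x[i], y[i]) is a positive pair.
    return (x - y).norm(p=2, dim=1).pow(alpha).mean()

def uniform_loss(x, t=2):
    # Uniformity: log of the mean pairwise Gaussian potential over the batch,
    # which is minimized when features spread uniformly on the hypersphere.
    return torch.pdist(x, p=2).pow(2).mul(-t).exp().mean().log()

# Illustrative usage with random features standing in for encoder outputs:
x = F.normalize(torch.randn(128, 64), dim=1)
y = F.normalize(torch.randn(128, 64), dim=1)
loss = align_loss(x, y) + uniform_loss(x)
```

Because both quantities are differentiable, they can be optimized directly as a training objective, which is how the paper obtains representations competitive with contrastive learning.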


Related research

03/14/2022 · Rethinking Minimal Sufficient Representation in Contrastive Learning
Contrastive learning between different views of the data achieves outsta...

10/17/2022 · Correlation between Alignment-Uniformity and Performance of Dense Contrastive Representations
Recently, dense contrastive learning has shown superior performance on d...

09/29/2022 · Understanding Collapse in Non-Contrastive Siamese Representation Learning
Contrastive methods have led a recent surge in the performance of self-s...

02/01/2022 · HCSC: Hierarchical Contrastive Selective Coding
Hierarchical semantic structures naturally exist in an image dataset, in...

12/16/2022 · Feature Dropout: Revisiting the Role of Augmentations in Contrastive Learning
What role do augmentations play in contrastive learning? Recent work sug...

09/14/2022 · Jointly Contrastive Representation Learning on Road Network and Trajectory
Road network and trajectory representation learning are essential for tr...

11/20/2022 · Can Single-Pass Contrastive Learning Work for Both Homophilic and Heterophilic Graph?
Existing graph contrastive learning (GCL) typically requires two forward...
