On the duality between contrastive and non-contrastive self-supervised learning

by Quentin Garrido et al.

Recent approaches in self-supervised learning of image representations can be categorized into different families of methods and, in particular, can be divided into contrastive and non-contrastive approaches. While differences between the two families have been thoroughly discussed to motivate new approaches, we focus more on the theoretical similarities between them. By designing contrastive and non-contrastive criteria that can be related algebraically and shown to be equivalent under limited assumptions, we show how close those families can be. We further study popular methods and introduce variations of them, allowing us to relate this theoretical result to current practices and show how design choices in the criterion can influence the optimization process and downstream performance. We also challenge the popular assumptions that contrastive and non-contrastive methods, respectively, need large batch sizes and output dimensions. Our theoretical and quantitative results suggest that the numerical gaps between contrastive and non-contrastive methods in certain regimes can be significantly reduced given better network design choices and hyperparameter tuning.
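A minimal numerical sketch of the kind of algebraic link the abstract alludes to (illustrative only, not the paper's exact criteria): for a batch of embeddings Z, sample-contrastive criteria operate on the N x N Gram matrix Z Zᵀ of pairwise sample similarities, while non-contrastive (dimension-contrastive) criteria operate on the D x D matrix Zᵀ Z of feature correlations. The two matrices share the same nonzero eigenvalues, so rotation-invariant penalties on them coincide:

```python
import numpy as np

rng = np.random.default_rng(0)
N, D = 8, 5                      # batch size, embedding dimension
Z = rng.normal(size=(N, D))      # stand-in for a batch of embeddings

# Sample-contrastive view: similarities between samples (N x N)
gram = Z @ Z.T
# Dimension-contrastive view: similarities between features (D x D)
cov = Z.T @ Z

# Both matrices have the same nonzero eigenvalues (the squared singular
# values of Z), so e.g. their Frobenius norms are equal:
print(np.linalg.norm(gram, "fro"))
print(np.linalg.norm(cov, "fro"))
```

The equality holds because ||Z Zᵀ||² = ||Zᵀ Z||² = Σᵢ σᵢ⁴, where σᵢ are the singular values of Z; this is the sort of "limited assumptions" algebraic equivalence the abstract mentions, stated here for a generic embedding matrix rather than for any specific method's loss.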

