On the duality between contrastive and non-contrastive self-supervised learning

06/03/2022
by Quentin Garrido, et al.

Recent approaches in self-supervised learning of image representations can be categorized into different families of methods and, in particular, can be divided into contrastive and non-contrastive approaches. While differences between the two families have been thoroughly discussed to motivate new approaches, we focus more on the theoretical similarities between them. By designing contrastive and non-contrastive criteria that can be related algebraically and shown to be equivalent under limited assumptions, we show how close those families can be. We further study popular methods and introduce variations of them, allowing us to relate this theoretical result to current practices and show how design choices in the criterion can influence the optimization process and downstream performance. We also challenge the popular assumptions that contrastive and non-contrastive methods, respectively, need large batch sizes and output dimensions. Our theoretical and quantitative results suggest that the numerical gaps between contrastive and non-contrastive methods in certain regimes can be significantly reduced given better network design choices and hyperparameter tuning.
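
To make the two families concrete, below is a minimal sketch, assuming PyTorch, of one representative criterion from each: a sample-contrastive InfoNCE/SimCLR-style loss, which uses the other samples in the batch as negatives, and a non-contrastive VICReg-style loss, which avoids negatives by regularizing the variance and covariance of the embedding dimensions. These are generic illustrations of the two families discussed in the abstract, not the specific criteria constructed and related algebraically in the paper; all function names, coefficients, and shapes are illustrative.

```python
# Illustrative losses for the two SSL families, operating on a batch of
# paired view embeddings z_a, z_b of shape (N, D). Hyperparameter values
# are placeholders, not the paper's settings.
import torch
import torch.nn.functional as F


def off_diagonal(m: torch.Tensor) -> torch.Tensor:
    """Zero out the diagonal of a square matrix."""
    return m - torch.diag(torch.diag(m))


def contrastive_infonce(z_a: torch.Tensor, z_b: torch.Tensor,
                        temperature: float = 0.1) -> torch.Tensor:
    """Sample-contrastive loss (InfoNCE/SimCLR style): each z_a[i] is pulled
    toward its positive z_b[i] and pushed away from the other samples in the
    batch, which act as negatives."""
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)
    logits = z_a @ z_b.T / temperature               # (N, N) similarity matrix
    targets = torch.arange(z_a.size(0), device=z_a.device)
    return F.cross_entropy(logits, targets)


def non_contrastive_vicreg_like(z_a: torch.Tensor, z_b: torch.Tensor,
                                inv_w: float = 25.0, var_w: float = 25.0,
                                cov_w: float = 1.0) -> torch.Tensor:
    """Non-contrastive loss (VICReg style): invariance between views plus
    variance and covariance regularization over embedding dimensions,
    with no explicit negative samples."""
    n, d = z_a.shape
    # Invariance: the two views of the same image should map close together.
    invariance = F.mse_loss(z_a, z_b)
    # Variance: keep each embedding dimension's std above 1 to avoid collapse.
    std_a = torch.sqrt(z_a.var(dim=0) + 1e-4)
    std_b = torch.sqrt(z_b.var(dim=0) + 1e-4)
    variance = torch.mean(F.relu(1.0 - std_a)) + torch.mean(F.relu(1.0 - std_b))
    # Covariance: decorrelate embedding dimensions (off-diagonal terms -> 0).
    z_a_c = z_a - z_a.mean(dim=0)
    z_b_c = z_b - z_b.mean(dim=0)
    cov_a = (z_a_c.T @ z_a_c) / (n - 1)
    cov_b = (z_b_c.T @ z_b_c) / (n - 1)
    covariance = (off_diagonal(cov_a) ** 2).sum() / d + (off_diagonal(cov_b) ** 2).sum() / d
    return inv_w * invariance + var_w * variance + cov_w * covariance


# Example usage with random embeddings (batch of 256, 128-dim output):
# z_a, z_b = torch.randn(256, 128), torch.randn(256, 128)
# loss_c = contrastive_infonce(z_a, z_b)
# loss_nc = non_contrastive_vicreg_like(z_a, z_b)
```

The structural difference is visible in the shapes: the contrastive loss builds an N x N matrix over samples, while the non-contrastive loss builds D x D covariance matrices over dimensions, which is one way to see why the former is often tied to batch size and the latter to output dimension, the very assumptions the paper challenges.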

