Understanding Collapse in Non-Contrastive Siamese Representation Learning

09/29/2022
by   Alexander C. Li, et al.

Contrastive methods have led to a recent surge in the performance of self-supervised representation learning (SSL). Recent methods like BYOL or SimSiam purportedly distill these contrastive methods down to their essence, removing bells and whistles, including the negative examples, that do not contribute to downstream performance. These "non-contrastive" methods work surprisingly well without using negatives even though the global minimum lies at trivial collapse. We empirically analyze these non-contrastive methods and find that SimSiam is extraordinarily sensitive to dataset and model size. In particular, SimSiam representations undergo partial dimensional collapse if the model is too small relative to the dataset size. We propose a metric to measure the degree of this collapse and show that it can be used to forecast downstream task performance without any fine-tuning or labels. We further analyze architectural design choices and their effect on downstream performance. Finally, we demonstrate that shifting to a continual learning setting acts as a regularizer and prevents collapse, and that a hybrid between continual and multi-epoch training can improve linear probe accuracy by as much as 18 percentage points using ResNet-18 on ImageNet. Our project page is at https://alexanderli.com/noncontrastive-ssl/.
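The abstract does not spell out the proposed collapse metric, but partial dimensional collapse is typically quantified by how the variance of the learned embeddings is spread across dimensions. The sketch below is a minimal illustration of that idea in PyTorch; the function name `collapse_metric` and the 0.99 variance threshold are assumptions for illustration, not the authors' exact formulation.

```python
import torch

def collapse_metric(embeddings: torch.Tensor, threshold: float = 0.99) -> float:
    """Fraction of embedding dimensions needed to explain `threshold` of the
    total variance. Values near 1.0 indicate a spread-out (non-collapsed)
    representation; values near 0 indicate dimensional collapse.

    embeddings: N x D matrix of representations from a frozen encoder.
    """
    # Center the embeddings so singular values reflect variance directions.
    z = embeddings - embeddings.mean(dim=0, keepdim=True)

    # Singular values of the centered matrix, in descending order.
    s = torch.linalg.svdvals(z)
    var = s ** 2

    # Cumulative fraction of variance explained by the top-k directions.
    cum = torch.cumsum(var, dim=0) / var.sum()

    # Smallest k whose top-k directions cover the variance threshold.
    k = int(torch.searchsorted(cum, torch.tensor(threshold)).item()) + 1
    return k / embeddings.shape[1]
```

On embeddings extracted from a training checkpoint, a value well below 1.0 would signal partial dimensional collapse, and tracking it across checkpoints gives a label-free proxy for downstream accuracy in the spirit of the forecasting result described above.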

Related research

05/20/2020 - Understanding Contrastive Representation Learning through Alignment and Uniformity on the Hypersphere
  Contrastive representation learning has been outstandingly successful in...

06/08/2023 - Sy-CON: Symmetric Contrastive Loss for Continual Self-Supervised Representation Learning
  We introduce a novel and general loss function, called Symmetric Contras...

02/07/2022 - Crafting Better Contrastive Views for Siamese Representation Learning
  Recent self-supervised contrastive learning methods greatly benefit from...

10/11/2021 - Towards Demystifying Representation Learning with Non-contrastive Self-supervision
  Non-contrastive methods of self-supervised learning (such as BYOL and Si...

07/01/2020 - Debiased Contrastive Learning
  A prominent technique for self-supervised representation learning has be...

12/25/2020 - Taxonomy of multimodal self-supervised representation learning
  Sensory input from multiple sources is crucial for robust and coherent h...

06/23/2023 - Patch-Level Contrasting without Patch Correspondence for Accurate and Dense Contrastive Representation Learning
  We propose ADCLR: Accurate and Dense Contrastive Representation Learni...
