Contrastive and Non-Contrastive Self-Supervised Learning Recover Global and Local Spectral Embedding Methods

05/23/2022
by Randall Balestriero, et al.

Self-Supervised Learning (SSL) surmises that inputs and pairwise positive relationships are enough to learn meaningful representations. Although SSL has recently reached a milestone, outperforming supervised methods in many modalities, its theoretical foundations remain limited, method-specific, and fail to provide principled design guidelines to practitioners. In this paper, we propose a unifying framework, grounded in spectral manifold learning, to address those limitations. Through the course of this study, we rigorously demonstrate that VICReg, SimCLR, BarlowTwins, and related methods correspond to eponymous spectral methods such as Laplacian Eigenmaps and Multidimensional Scaling. This unification allows us to obtain (i) the closed-form optimal representation for each method, (ii) the closed-form optimal network parameters in the linear regime for each method, (iii) the impact of the pairwise relations used during training on each of those quantities and on downstream-task performance, and, most importantly, (iv) the first theoretical bridge between contrastive and non-contrastive methods, tying them to global and local spectral embedding methods respectively and hinting at the benefits and limitations of each.

For example: (i) if the pairwise relation is aligned with the downstream task, any SSL method can be employed successfully and will recover the supervised method, but in the low-data regime VICReg's invariance hyper-parameter should be high; (ii) if the pairwise relation is misaligned with the downstream task, VICReg with a small invariance hyper-parameter should be preferred over SimCLR or BarlowTwins.
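
To make claim (iv) concrete, below is a minimal sketch (ours, not the paper's code) of the two spectral-embedding families the abstract pairs the SSL methods with: Laplacian Eigenmaps, the local method tied to non-contrastive objectives such as VICReg, and classical Multidimensional Scaling, a global method on the contrastive side with SimCLR. The toy positive-pair matrix G and every function name here are illustrative assumptions.

```python
# Illustrative sketch, assuming a toy symmetric matrix G where
# G[i, j] > 0 marks a positive pair (e.g., two augmented views).
import numpy as np

def laplacian_eigenmaps(G, k=2):
    """Local spectral embedding: eigenvectors of the graph Laplacian
    L = D - G associated with the smallest non-zero eigenvalues."""
    D = np.diag(G.sum(axis=1))
    L = D - G
    vals, vecs = np.linalg.eigh(L)       # ascending eigenvalues
    return vecs[:, 1:k + 1]              # skip the constant eigenvector

def classical_mds(G, k=2):
    """Global spectral embedding (classical MDS): top eigenvectors of the
    double-centered squared-distance matrix. Using 1 - G as a crude
    dissimilarity, purely for illustration."""
    dist = 1.0 - G
    np.fill_diagonal(dist, 0.0)          # a point is at distance 0 from itself
    n = G.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n  # double-centering matrix
    B = -0.5 * J @ (dist ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:k]     # keep the k largest eigenvalues
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Toy positive-pair graph: two clusters of augmented views, weakly linked.
G = np.zeros((6, 6))
for i, j, w in [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0),
                (3, 4, 1.0), (4, 5, 1.0), (3, 5, 1.0), (2, 3, 0.1)]:
    G[i, j] = G[j, i] = w

print(laplacian_eigenmaps(G))  # local embedding (non-contrastive analogue)
print(classical_mds(G))        # global embedding (contrastive analogue)
```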
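
Since the practical guidelines above turn on VICReg's invariance hyper-parameter, the following hedged sketch of a VICReg-style objective makes that knob explicit; the weight values and the 1e-4 variance epsilon are illustrative defaults, not the paper's settings.

```python
import torch
import torch.nn.functional as F

def vicreg_loss(z_a, z_b, lam_inv=25.0, lam_var=25.0, lam_cov=1.0):
    """VICReg-style objective over two batches of embeddings (n x d)."""
    n, d = z_a.shape
    # Invariance: pull the two views' embeddings together.
    inv = F.mse_loss(z_a, z_b)
    # Variance: keep each embedding dimension from collapsing.
    std_a = torch.sqrt(z_a.var(dim=0) + 1e-4)
    std_b = torch.sqrt(z_b.var(dim=0) + 1e-4)
    var = torch.relu(1 - std_a).mean() + torch.relu(1 - std_b).mean()
    # Covariance: decorrelate the embedding dimensions.
    za, zb = z_a - z_a.mean(0), z_b - z_b.mean(0)
    cov_a, cov_b = (za.T @ za) / (n - 1), (zb.T @ zb) / (n - 1)
    off_diag = lambda c: c.flatten()[:-1].view(d - 1, d + 1)[:, 1:]
    cov = off_diag(cov_a).pow(2).sum() / d + off_diag(cov_b).pow(2).sum() / d
    return lam_inv * inv + lam_var * var + lam_cov * cov
```

Raising lam_inv strengthens the pull between positive pairs, matching guideline (i) above for aligned pairwise relations in the low-data regime; lowering it, per guideline (ii), is the safer choice when the pairwise relation is misaligned with the downstream task.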


