A Note on Connecting Barlow Twins with Negative-Sample-Free Contrastive Learning

04/28/2021
by Yao-Hung Hubert Tsai, et al.

In this report, we relate the algorithmic design of the Barlow Twins method to the Hilbert-Schmidt Independence Criterion (HSIC), thereby establishing it as a contrastive learning approach that is free of negative samples. Through this perspective, we argue that Barlow Twins (and, more broadly, the class of negative-sample-free contrastive learning methods) suggests a way to bridge the two major families of self-supervised learning philosophies: non-contrastive and contrastive approaches. In particular, Barlow Twins exemplifies how to combine the best practices of both worlds: avoiding the need for large training batch sizes and negative sample pairing (like non-contrastive methods), while also avoiding symmetry-breaking network designs (like contrastive methods).
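For reference, the objective under discussion is the Barlow Twins loss of Zbontar et al. (2021): the empirical cross-correlation matrix between the batch-normalized embeddings of two augmented views of the same images is driven toward the identity matrix. The minimal NumPy sketch below illustrates that objective; the function name, the default weight lam, and the toy usage at the end are illustrative assumptions, not code from the note.

    import numpy as np

    def barlow_twins_loss(z1, z2, lam=5e-3):
        # z1, z2: (batch_size, dim) embeddings of two augmented views of the same batch.
        # Standardize each embedding dimension over the batch (zero mean, unit variance).
        z1 = (z1 - z1.mean(axis=0)) / z1.std(axis=0)
        z2 = (z2 - z2.mean(axis=0)) / z2.std(axis=0)
        n = z1.shape[0]
        c = z1.T @ z2 / n  # (dim, dim) empirical cross-correlation matrix
        on_diag = np.sum((1.0 - np.diag(c)) ** 2)            # invariance: push C_ii toward 1
        off_diag = np.sum(c ** 2) - np.sum(np.diag(c) ** 2)  # redundancy reduction: push C_ij (i != j) toward 0
        return on_diag + lam * off_diag

    # Toy usage: a correlated pair of random "views".
    rng = np.random.default_rng(0)
    z1 = rng.standard_normal((256, 128))
    z2 = z1 + 0.1 * rng.standard_normal((256, 128))
    print(barlow_twins_loss(z1, z2))

The on-diagonal term enforces invariance across views, while the off-diagonal term decorrelates embedding dimensions; the note's contribution is to read this objective through the lens of HSIC, under which dependence between the two views is maximized without any negative pairs.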


Related research

06/03/2022
On the duality between contrastive and non-contrastive self-supervised learning
Recent approaches in self-supervised learning of image representations c...

05/27/2023
Kernel-SSL: Kernel KL Divergence for Self-Supervised Learning
Contrastive learning usually compares one positive anchor sample with lo...

10/05/2020
EqCo: Equivalent Rules for Self-supervised Contrastive Learning
In this paper, we propose a method, named EqCo (Equivalent Rules for Con...

08/10/2022
Non-Contrastive Self-Supervised Learning of Utterance-Level Speech Representations
Considering the abundance of unlabeled speech data and the high labeling...

10/20/2020
BYOL works even without batch statistics
Bootstrap Your Own Latent (BYOL) is a self-supervised learning approach ...

03/30/2022
Dual Temperature Helps Contrastive Learning Without Many Negative Samples: Towards Understanding and Simplifying MoCo
Contrastive learning (CL) is widely known to require many negative sampl...

03/30/2022
How Does SimSiam Avoid Collapse Without Negative Samples? A Unified Understanding with Self-supervised Contrastive Learning
To avoid collapse in self-supervised learning (SSL), a contrastive loss ...
