Run Away From your Teacher: Understanding BYOL by a Novel Self-Supervised Approach

11/22/2020
by Haizhou Shi, et al.

Recently, the newly proposed self-supervised framework Bootstrap Your Own Latent (BYOL) has seriously challenged the necessity of negative samples in contrastive learning. BYOL works remarkably well even though it discards negative samples entirely and its training objective contains no explicit mechanism to prevent representation collapse. In this paper, we propose to understand BYOL through the lens of our interpretable self-supervised learning framework, Run Away From your Teacher (RAFT). RAFT optimizes two objectives simultaneously: (i) aligning two views of the same data to similar representations and (ii) pushing the representation away from the model's Mean Teacher (MT, the exponential moving average of past models), rather than pulling it toward the MT as BYOL does. The second term explicitly prevents representation collapse, making RAFT a conceptually more reliable framework. We provide basic benchmarks of RAFT on CIFAR10 to validate the effectiveness of our method. Furthermore, we prove that BYOL is equivalent to RAFT under certain conditions, offering a principled explanation for BYOL's counterintuitive success.
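To make the two objectives concrete, below is a minimal PyTorch sketch assuming squared Euclidean distances between L2-normalized encoder outputs; the names raft_loss and update_teacher, the hyperparameters lam (weight of the run-away term) and tau (EMA decay), and the exact distance form are illustrative assumptions, not the paper's precise formulation.

    import torch
    import torch.nn.functional as F

    def raft_loss(online, teacher, x1, x2, lam=1.0):
        # Two augmented views of the same batch, encoded by the trainable online network.
        z1 = F.normalize(online(x1), dim=-1)
        z2 = F.normalize(online(x2), dim=-1)
        # The Mean Teacher (EMA copy of the online network) receives no gradients.
        with torch.no_grad():
            t1 = F.normalize(teacher(x1), dim=-1)
            t2 = F.normalize(teacher(x2), dim=-1)
        # (i) alignment: pull the two views of the same data together.
        align = (z1 - z2).pow(2).sum(dim=-1).mean()
        # (ii) run away: push the online representations away from the Mean Teacher's.
        cross = (z1 - t1).pow(2).sum(dim=-1).mean() + (z2 - t2).pow(2).sum(dim=-1).mean()
        # Minimize alignment while maximizing the distance to the teacher (minus sign);
        # the second term is what explicitly counteracts representation collapse.
        return align - lam * cross

    @torch.no_grad()
    def update_teacher(online, teacher, tau=0.99):
        # Mean Teacher update: exponential moving average of the online weights.
        for p_o, p_t in zip(online.parameters(), teacher.parameters()):
            p_t.mul_(tau).add_(p_o, alpha=1.0 - tau)

In a training loop, one would backpropagate raft_loss on each batch and call update_teacher after every optimizer step, mirroring the momentum update used in BYOL but with the sign of the cross-model term flipped.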
