A Cookbook of Self-Supervised Learning

04/24/2023
by Randall Balestriero, et al.

Self-supervised learning, dubbed the dark matter of intelligence, is a promising path to advance machine learning. Yet, much like cooking, training SSL methods is a delicate art with a high barrier to entry. While many components are familiar, successfully training an SSL method involves a dizzying set of choices, from the pretext tasks to the training hyper-parameters. Our goal is to lower the barrier to entry into SSL research by laying out the foundations and latest SSL recipes in the style of a cookbook. We hope to empower the curious researcher to navigate the terrain of methods, understand the role of the various knobs, and gain the know-how required to explore how delicious SSL can be.
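To make the notion of a pretext task concrete, the sketch below shows a SimCLR-style contrastive objective (NT-Xent), one of the joint-embedding recipes the cookbook surveys. This is a minimal illustration rather than the paper's prescribed recipe; the function name, temperature, and batch shapes are illustrative assumptions.

import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    # z1, z2: (batch, dim) embeddings of two augmented views of the same batch.
    # Each sample's positive is its other view; all remaining samples are negatives.
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)              # (2B, dim)
    sim = z @ z.t() / temperature               # temperature-scaled cosine similarities
    n = z.shape[0]
    sim.fill_diagonal_(float('-inf'))           # exclude self-similarity
    targets = torch.arange(n, device=z.device)
    targets = (targets + n // 2) % n            # index of the paired view
    return F.cross_entropy(sim, targets)

# Usage (hypothetical encoder outputs for two augmentations of a batch of 8 images):
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
loss = nt_xent_loss(z1, z2)

The key design choice this illustrates is the one the cookbook emphasizes: the pretext task (here, instance discrimination across augmented views) and its knobs (augmentations, temperature, batch size) largely determine what the learned representation captures.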

Related research:

11/15/2022 - Homomorphic Self-Supervised Learning: In this work, we observe that many existing self-supervised learning alg...
04/05/2021 - An Empirical Study of Training Self-Supervised Vision Transformers: This paper does not describe a novel method. Instead, it studies a strai...
12/11/2022 - Accelerating Self-Supervised Learning via Efficient Training Strategies: Recently the focus of the computer vision community has shifted from exp...
09/05/2021 - Re-entry Prediction for Online Conversations via Self-Supervised Learning: In recent years, world business in online discussions and opinion sharin...
06/09/2021 - Self-supervised Feature Enhancement: Applying Internal Pretext Task to Supervised Learning: Traditional self-supervised learning requires CNNs using external pretex...
02/03/2023 - Blockwise Self-Supervised Learning at Scale: Current state-of-the-art deep networks are all powered by backpropagatio...
01/13/2021 - Self-Supervised Vessel Enhancement Using Flow-Based Consistencies: Vessel segmenting is an essential task in many clinical applications. Al...
