RankMe: Assessing the downstream performance of pretrained self-supervised representations by their rank

10/05/2022
by   Quentin Garrido, et al.

Joint-Embedding Self-Supervised Learning (JE-SSL) has seen rapid development, with the emergence of many method variations but few principled guidelines to help practitioners deploy these methods successfully. The main source of this difficulty is JE-SSL's core principle of not employing any input reconstruction: without a visual cue, it is very hard to judge the quality of a learned representation without access to a labeled dataset. We hope to correct those limitations by providing a single, theoretically motivated criterion that reflects the quality of learned JE-SSL representations: their effective rank. Albeit simple and computationally friendly, this method, coined RankMe, allows one to assess the performance of JE-SSL representations, even on different downstream datasets, without requiring any labels, training, or parameters to tune. Through thorough empirical experiments involving hundreds of training episodes, we demonstrate how RankMe can be used for hyperparameter selection with nearly no loss in final performance compared to the current selection method, which involves dataset labels. We hope that RankMe will facilitate the use of JE-SSL in domains with little or no labeled data.
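For intuition, here is a minimal NumPy sketch of the entropy-based effective rank (Roy and Vetterli, 2007) that RankMe builds on: the exponential of the Shannon entropy of the matrix's normalized singular values. The function name, epsilon value, and toy data below are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def rankme(Z: np.ndarray, eps: float = 1e-7) -> float:
    """Entropy-based effective rank of an embedding matrix Z (samples x dims).

    Normalizes the singular values of Z into a distribution p and returns
    exp(-sum(p * log(p))), the exponential of their Shannon entropy.
    """
    # Singular values of the embedding matrix (all non-negative).
    s = np.linalg.svd(Z, compute_uv=False)
    # Normalize to (approximately) a probability distribution; eps guards log(0).
    p = s / s.sum() + eps
    # Effective rank ranges from ~1 (fully collapsed) to min(Z.shape) (full rank).
    return float(np.exp(-np.sum(p * np.log(p))))

# Toy usage: embeddings confined to a 32-dimensional subspace of a
# 256-dimensional space score an effective rank far below 256.
rng = np.random.default_rng(0)
Z = rng.normal(size=(2048, 32)) @ rng.normal(size=(32, 256))  # rank <= 32
print(f"RankMe: {rankme(Z):.1f}")  # well below the ambient dimension of 256
```

Because the criterion is a smooth function of the singular-value spectrum rather than a hard rank count, it distinguishes a representation whose variance is spread across many directions from one that has quietly collapsed onto a few, which is what makes it usable for label-free hyperparameter selection.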


