Unsupervised Learning on a DIET: Datum IndEx as Target Free of Self-Supervision, Reconstruction, Projector Head

02/20/2023
by Randall Balestriero et al.

Costly, noisy, and over-specialized, labels are to be set aside in favor of unsupervised learning if we hope to learn cheap, reliable, and transferable models. To that end, spectral embedding, self-supervised learning, and generative modeling have offered competitive solutions. Those methods, however, come with numerous challenges, e.g., estimating geodesic distances, specifying projector architectures and anti-collapse losses, or specifying decoder architectures and reconstruction losses. In contrast, we introduce a simple, explainable alternative, coined DIET, to learn representations from unlabeled data free of those challenges. DIET is blatantly simple: take one's favorite classification setup and use the Datum IndEx as its Target class, i.e., each sample is its own class; no further changes are needed. DIET works without a decoder/projector network, is not based on positive pairs or reconstruction, introduces no hyper-parameters, and works out-of-the-box across datasets and architectures. Despite DIET's simplicity, the learned representations are of high quality and often on par with the state-of-the-art, e.g., a linear classifier on top of DIET's learned representation reaches 71.4% on CIFAR100 with a Resnet101 and 52.5% on TinyImagenet with a Resnext50.

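To make the "each sample is its own class" recipe concrete, below is a minimal sketch of DIET-style training. It assumes PyTorch with a torchvision ResNet-18 backbone on CIFAR100; the IndexedDataset wrapper, augmentations, batch size, optimizer, and learning rate are illustrative choices of ours, not the paper's prescribed recipe.

    import torch
    import torch.nn as nn
    import torchvision
    import torchvision.transforms as T
    from torch.utils.data import DataLoader, Dataset

    class IndexedDataset(Dataset):
        """Return (image, index): the Datum IndEx serves as the Target class."""
        def __init__(self, base):
            self.base = base
        def __len__(self):
            return len(self.base)
        def __getitem__(self, i):
            x, _ = self.base[i]   # the original label is discarded (unsupervised)
            return x, i

    transform = T.Compose([T.RandomResizedCrop(32), T.RandomHorizontalFlip(), T.ToTensor()])
    base = torchvision.datasets.CIFAR100("data", train=True, download=True, transform=transform)
    loader = DataLoader(IndexedDataset(base), batch_size=256, shuffle=True)

    backbone = torchvision.models.resnet18(weights=None)
    feat_dim = backbone.fc.in_features           # 512 for ResNet-18
    backbone.fc = nn.Identity()                  # pre-fc features are the learned representation
    diet_head = nn.Linear(feat_dim, len(base))   # one output per datum index; no projector, no decoder

    optimizer = torch.optim.AdamW(list(backbone.parameters()) + list(diet_head.parameters()), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    for epoch in range(100):
        for x, idx in loader:
            loss = criterion(diet_head(backbone(x)), idx)   # plain classification with the index as target
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

After training, diet_head is discarded and the backbone's features are evaluated, e.g., with a linear classifier on top, as in the abstract's reported numbers.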

