
Self-Supervised Learning by Estimating Twin Class Distributions

10/14/2021
by Feng Wang, et al.

We present TWIST, a self-supervised representation learning method that classifies large-scale unlabeled datasets in an end-to-end way. We employ a Siamese network terminated by a softmax operation to produce twin class distributions for two augmented views of an image. Without supervision, we enforce the class distributions of different augmentations to be consistent. At the same time, we regularize the class distributions to be both sharp and diverse: we minimize the entropy of the distribution of each sample, making each class prediction assertive, and maximize the entropy of the mean distribution, making the predictions of different samples diverse. In this way, TWIST naturally avoids trivial solutions without specific designs such as an asymmetric network, a stop-gradient operation, or a momentum encoder. Unlike clustering-based methods, which alternate between clustering and learning, our method is a single learning process guided by a unified loss function. As a result, TWIST outperforms state-of-the-art methods on a wide range of tasks, including unsupervised classification, linear classification, semi-supervised learning, transfer learning, and dense prediction tasks such as detection and segmentation.
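The three regularizers described in the abstract (consistency between the twin distributions, low per-sample entropy, high entropy of the batch-mean distribution) can be sketched in a few lines. This is a minimal NumPy sketch, not the authors' implementation; the function names and the equal weighting of the three terms are assumptions for illustration.

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax over the class axis
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def entropy(p, axis=-1, eps=1e-12):
    # Shannon entropy of a probability distribution
    return -(p * np.log(p + eps)).sum(axis=axis)

def twist_loss(logits_a, logits_b, eps=1e-12):
    """Sketch of a TWIST-style objective for one batch.

    logits_a, logits_b: class logits of shape (batch, num_classes)
    produced by the Siamese network for two augmentations.
    """
    p_a, p_b = softmax(logits_a), softmax(logits_b)
    # consistency: symmetric cross-entropy between the twin distributions
    consistency = -0.5 * (
        (p_a * np.log(p_b + eps)).sum(-1)
        + (p_b * np.log(p_a + eps)).sum(-1)
    ).mean()
    # sharpness: mean per-sample entropy (to be minimized)
    sharpness = 0.5 * (entropy(p_a).mean() + entropy(p_b).mean())
    # diversity: entropy of the batch-mean distribution (to be maximized)
    diversity = 0.5 * (entropy(p_a.mean(0)) + entropy(p_b.mean(0)))
    return consistency + sharpness - diversity
```

A batch in which every sample is confidently assigned to a class, with the classes evenly used across the batch, drives all three terms toward their optima, which is exactly the regime the abstract describes as avoiding trivial (collapsed) solutions.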


Related research

01/13/2020 · Semi-supervised learning method based on predefined evenly-distributed class centroids
Compared to supervised learning, semi-supervised learning reduces the de...

09/17/2020 · MoPro: Webly Supervised Learning with Momentum Prototypes
We propose a webly-supervised representation learning method that does n...

05/16/2021 · Semi-supervised Contrastive Learning with Similarity Co-calibration
Semi-supervised learning acts as an effective way to leverage massive un...

03/19/2021 · Self-Supervised Classification Network
We present Self-Classifier – a novel self-supervised end-to-end classifi...

05/31/2023 · Additional Positive Enables Better Representation Learning for Medical Images
This paper presents a new way to identify additional positive pairs for ...

03/16/2022 · Relational Self-Supervised Learning
Self-supervised Learning (SSL) including the mainstream contrastive lear...

06/23/2021 · Bootstrap Representation Learning for Segmentation on Medical Volumes and Sequences
In this work, we propose a novel straightforward method for medical volu...