Self-Supervised Learning by Estimating Twin Class Distributions

by Feng Wang, et al.

We present TWIST, a self-supervised representation learning method that classifies large-scale unlabeled datasets in an end-to-end way. We employ a siamese network terminated by a softmax operation to produce twin class distributions for two augmented views of an image. Without supervision, we enforce the class distributions of the different augmentations to be consistent. At the same time, we regularize the class distributions to make them both sharp and diverse: we minimize the entropy of each sample's distribution so that the class prediction for each sample is assertive, and we maximize the entropy of the mean distribution so that the predictions of different samples are diverse. In this way, TWIST naturally avoids trivial solutions without specific designs such as an asymmetric network, a stop-gradient operation, or a momentum encoder. Unlike clustering-based methods, which alternate between clustering and learning, our method is a single learning process guided by a unified loss function. As a result, TWIST outperforms state-of-the-art methods on a wide range of tasks, including unsupervised classification, linear classification, semi-supervised learning, transfer learning, and dense prediction tasks such as detection and segmentation.
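The three terms described above (consistency between the twin distributions, per-sample sharpness, and batch-level diversity) can be sketched as a single loss function. The following is a minimal NumPy illustration of the idea, not the authors' reference implementation; the equal weighting of the three terms and the batch-mean formulation are assumptions for clarity.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def twist_loss(logits1, logits2, eps=1e-8):
    """Sketch of a TWIST-style objective for two augmented views.

    consistency: the twin class distributions should agree
                 (symmetric cross-entropy between the two views)
    sharpness:   each sample's distribution should have low entropy,
                 i.e. an assertive class prediction
    diversity:   the mean distribution over the batch should have high
                 entropy, spreading predictions across classes and
                 preventing the trivial constant solution
    """
    p1, p2 = softmax(logits1), softmax(logits2)

    # Symmetric cross-entropy between the two views (minimized).
    consistency = -0.5 * ((p1 * np.log(p2 + eps)).sum(axis=1)
                          + (p2 * np.log(p1 + eps)).sum(axis=1)).mean()

    # Per-sample entropy (minimized to make predictions sharp).
    sharpness = -0.5 * ((p1 * np.log(p1 + eps)).sum(axis=1)
                        + (p2 * np.log(p2 + eps)).sum(axis=1)).mean()

    # Negative entropy of the batch-mean distribution; minimizing this
    # term maximizes the entropy of the mean, enforcing diversity.
    mean_p = 0.5 * (p1.mean(axis=0) + p2.mean(axis=0))
    diversity = (mean_p * np.log(mean_p + eps)).sum()

    return consistency + sharpness + diversity
```

A quick sanity check: a batch of identical, sharp, class-balanced predictions achieves a lower loss than random predictions, since all three terms are then near their optima.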
