ISD: Self-Supervised Learning by Iterative Similarity Distillation

12/16/2020
by Ajinkya Tejankar, et al.

Recently, contrastive learning has achieved great results in self-supervised learning, where the main idea is to push two augmentations of an image (a positive pair) closer together than other random images (negative pairs). We argue that not all random images are equal. Hence, we introduce a self-supervised learning algorithm in which we use soft similarities for the negative images rather than a binary distinction between positive and negative pairs. We iteratively distill a slowly evolving teacher model into the student model by capturing the similarity of a query image to some random images and transferring that knowledge to the student. We argue that our method is less constrained than recent contrastive learning methods, so it can learn better features. In particular, our method should handle unbalanced, unlabeled data better than existing contrastive learning methods, because the randomly chosen negative set may include many samples that are semantically similar to the query image. In this case, our method labels them as highly similar, while standard contrastive methods label them as negative pairs. Our method achieves better results than state-of-the-art models like BYOL and MoCo in transfer learning settings. We also show that our method performs better when the unlabeled data is unbalanced. Our code is available here: https://github.com/UMBCvision/ISD.
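The loss at the heart of this idea is easy to state concretely. Below is a minimal PyTorch sketch of one training step's loss under a few assumptions on our part: a momentum (EMA) teacher, a memory bank of L2-normalized features for the random images, and illustrative temperatures. The function names, the momentum value, and the temperature values are hypothetical; the authors' actual implementation is in the repository linked above.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def ema_update(teacher, student, m=0.99):
    """Slowly evolve the teacher toward the student (momentum/EMA update).
    The momentum value 0.99 is an illustrative assumption."""
    for p_t, p_s in zip(teacher.parameters(), student.parameters()):
        p_t.data.mul_(m).add_(p_s.data, alpha=1.0 - m)

def similarity_distillation_loss(student, teacher, x_q, x_k, bank,
                                 temp_s=0.1, temp_t=0.02):
    """x_q, x_k: two augmentations of the same image batch.
    bank: (N, D) L2-normalized features of N random images.
    temp_s / temp_t are hypothetical student / teacher temperatures."""
    q = F.normalize(student(x_q), dim=1)        # student sees one view
    with torch.no_grad():
        k = F.normalize(teacher(x_k), dim=1)    # teacher sees the other view
    # Similarity of each query to every random image in the bank.
    logits_s = q @ bank.t() / temp_s            # (B, N) student logits
    logits_t = k @ bank.t() / temp_t            # (B, N) teacher logits
    # The teacher's similarity distribution is the soft target: semantically
    # similar "negatives" get high probability instead of being pushed away
    # as in standard contrastive losses.
    return F.kl_div(F.log_softmax(logits_s, dim=1),
                    F.softmax(logits_t, dim=1),
                    reduction='batchmean')
```

After each optimizer step on the student, `ema_update` would be called so that the teacher keeps drifting toward the student, which is the "iterative" part of the distillation.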


Related research

05/09/2023 · MSVQ: Self-Supervised Learning with Multiple Sample Views and Queues
Self-supervised methods based on contrastive learning have achieved grea...

01/05/2023 · Learning by Sorting: Self-supervised Learning with Group Ordering Constraints
Contrastive learning has become a prominent ingredient in learning repre...

09/28/2022 · Non-contrastive approaches to similarity learning: positive examples are all you need
The similarity learning problem in the oil & gas industry aims to constr...

10/27/2021 · Robust Contrastive Learning Using Negative Samples with Diminished Semantics
Unsupervised learning has recently made exceptional progress because of ...

10/17/2022 · Improving Contrastive Learning on Visually Homogeneous Mars Rover Images
Contrastive learning has recently demonstrated superior performance to s...

04/20/2021 · SelfReg: Self-supervised Contrastive Regularization for Domain Generalization
In general, an experimental environment for deep learning assumes that t...

07/04/2021 · Bag of Instances Aggregation Boosts Self-supervised Learning
Recent advances in self-supervised learning have experienced remarkable ...
