Constrained Mean Shift for Representation Learning

10/19/2021
by Ajinkya Tejankar, et al.

We are interested in representation learning from labeled or unlabeled data. Inspired by the recent success of self-supervised learning (SSL), we develop a non-contrastive representation learning method that can exploit additional knowledge. This additional knowledge may come from annotated labels in the supervised setting or from an SSL model of another modality in the SSL setting. Our main idea is to generalize the mean-shift algorithm by constraining the search space of nearest neighbors, resulting in semantically purer representations. Our method simply pulls the embedding of an instance closer to its nearest neighbors in a search space that is constrained using the additional knowledge. By leveraging this non-contrastive loss, we show that supervised ImageNet-1k pretraining with our method results in better transfer performance compared to the baselines. Further, we demonstrate that our method is relatively robust to label noise. Finally, we show that it is possible to use a noisy constraint across modalities to train self-supervised video models.
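To make the core idea concrete, here is a minimal PyTorch-style sketch of such a constrained nearest-neighbor loss in the supervised setting. This is our own reconstruction from the abstract, not the authors' code: the function name, the memory-bank layout, the choice of k, and the squared-distance pulling loss are all assumptions, and we assume each label has at least k entries in the bank.

```python
import torch
import torch.nn.functional as F

def constrained_mean_shift_loss(query_emb, target_emb, bank_emb,
                                bank_labels, labels, k=5):
    """Hypothetical sketch of a constrained mean-shift loss.

    Pulls each query embedding toward its k nearest neighbors in a
    memory bank, where the neighbor search is constrained to bank
    entries sharing the query's label (the "additional knowledge").

    query_emb:   (B, D) embeddings from the online encoder
    target_emb:  (B, D) embeddings from the momentum encoder
    bank_emb:    (N, D) memory bank of past momentum embeddings
    bank_labels: (N,)   labels of the bank entries
    labels:      (B,)   labels of the current batch
    """
    q = F.normalize(query_emb, dim=1)
    t = F.normalize(target_emb, dim=1)
    bank = F.normalize(bank_emb, dim=1)

    sim = t @ bank.t()  # (B, N) cosine similarities to the bank

    # Constrain the search space: mask out bank entries whose label
    # differs from the query's label before taking nearest neighbors.
    same_label = labels.unsqueeze(1) == bank_labels.unsqueeze(0)  # (B, N)
    sim = sim.masked_fill(~same_label, float('-inf'))

    nn_idx = sim.topk(k, dim=1).indices  # (B, k) constrained neighbors
    neighbors = bank[nn_idx]             # (B, k, D)

    # Non-contrastive loss: pull the query toward each neighbor
    # (mean squared distance between L2-normalized vectors).
    return ((q.unsqueeze(1) - neighbors) ** 2).sum(dim=2).mean()
```

In practice, target_emb would typically come from a slowly updated momentum encoder and the bank would be a queue of past target embeddings; in the cross-modal SSL setting described above, the label constraint would be replaced by a (possibly noisy) constraint derived from an SSL model of the other modality.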

Related research

- 12/08/2021: Constrained Mean Shift Using Distant Yet Related Neighbors for Representation Learning. "We are interested in representation learning in self-supervised, supervi..."
- 01/12/2021: Estimating Galactic Distances From Images Using Self-supervised Representation Learning. "We use a contrastive self-supervised learning framework to estimate dist..."
- 12/21/2022: Similarity Contrastive Estimation for Image and Video Soft Contrastive Self-Supervised Learning. "Contrastive representation learning has proven to be an effective self-s..."
- 05/15/2021: Mean Shift for Self-Supervised Learning. "Most recent self-supervised learning (SSL) algorithms learn features by ..."
- 11/17/2020: Can Semantic Labels Assist Self-Supervised Visual Representation Learning? "Recently, contrastive learning has largely advanced the progress of unsu..."
- 04/11/2022: Speech Sequence Embeddings using Nearest Neighbors Contrastive Learning. "We introduce a simple neural encoder architecture that can be trained us..."
- 10/20/2022: Does Decentralized Learning with Non-IID Unlabeled Data Benefit from Self Supervision? "Decentralized learning has been advocated and widely deployed to make ef..."