Adaptive Similarity Bootstrapping for Self-Distillation

03/23/2023
by Tim Lebailly, et al.

Most self-supervised methods for representation learning leverage a cross-view consistency objective, i.e., they maximize the representation similarity of a given image's augmented views. The recent NNCLR method goes beyond this cross-view paradigm and uses positive pairs from different images, obtained via nearest-neighbor bootstrapping, in a contrastive setting. We empirically show that, unlike the contrastive setting, which relies on negative samples, incorporating nearest-neighbor bootstrapping into a self-distillation scheme can lead to a performance drop or even collapse. We scrutinize the reason for this unexpected behavior and provide a solution: we propose to adaptively bootstrap neighbors based on the estimated quality of the latent space. We report consistent improvements over both the naive bootstrapping approach and the original baselines, across various self-distillation method/backbone combinations and standard downstream tasks. Our code will be released upon acceptance.
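The abstract's core idea, swapping a view's target embedding for a nearest neighbor only when the latent space is estimated to be good enough, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the memory queue, the use of cosine similarity to the nearest neighbor as the quality estimate, and the threshold-gated blending rule are all assumptions made for the sake of the example.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Project rows of x onto the unit sphere."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + 1e-12)

class AdaptiveNNBootstrap:
    """Hypothetical sketch of adaptive nearest-neighbor bootstrapping.

    A memory queue stores past embeddings. For each new embedding we
    retrieve its nearest neighbor from the queue and blend it in with a
    weight derived from their cosine similarity, used here as a stand-in
    for the estimated quality of the latent space: below the threshold
    the original target is kept, above it the neighbor is trusted.
    """

    def __init__(self, dim, queue_size=1024, threshold=0.5):
        self.queue = np.zeros((0, dim))   # FIFO memory of past embeddings
        self.queue_size = queue_size
        self.threshold = threshold        # hypothetical quality cutoff

    def update(self, z):
        """Push new (normalized) embeddings, evicting the oldest."""
        z = l2_normalize(z)
        self.queue = np.concatenate([z, self.queue])[: self.queue_size]

    def bootstrap(self, z):
        """Return adaptively bootstrapped targets for a batch z."""
        z = l2_normalize(z)
        if len(self.queue) == 0:
            return z                       # nothing to bootstrap from yet
        sims = z @ self.queue.T            # cosine similarities to queue
        idx = sims.argmax(axis=1)          # nearest-neighbor indices
        nn = self.queue[idx]
        q = sims[np.arange(len(z)), idx]   # NN similarity per sample
        # Adaptive gate: only blend in the neighbor when the estimated
        # latent-space quality (NN similarity) clears the threshold.
        w = np.where(q > self.threshold, q, 0.0)[:, None]
        return l2_normalize((1.0 - w) * z + w * nn)
```

In a self-distillation pipeline, `bootstrap` would be applied to the teacher branch's embeddings before computing the distillation loss, while `update` is fed the same embeddings to refresh the memory; early in training, when neighbors are still unreliable, the gate leaves targets untouched and the method reduces to the plain cross-view objective.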


Related research

* With a Little Help from My Friends: Nearest-Neighbor Contrastive Learning of Visual Representations (04/29/2021)
  Self-supervised learning algorithms based on instance discrimination tra...

* Global-Local Self-Distillation for Visual Representation Learning (07/29/2022)
  The downstream accuracy of self-supervised methods is tightly linked to ...

* Beyond Supervised vs. Unsupervised: Representative Benchmarking and Analysis of Image Representation Learning (06/16/2022)
  By leveraging contrastive learning, clustering, and other pretext tasks,...

* Nearest-Neighbor Inter-Intra Contrastive Learning from Unlabeled Videos (03/13/2023)
  Contrastive learning has recently narrowed the gap between self-supervis...

* Extending Momentum Contrast with Cross Similarity Consistency Regularization (06/07/2022)
  Contrastive self-supervised representation learning methods maximize the...

* Self-distillation Augmented Masked Autoencoders for Histopathological Image Classification (03/31/2022)
  Self-supervised learning (SSL) has drawn increasing attention in patholo...

* ScoreCL: Augmentation-Adaptive Contrastive Learning via Score-Matching Function (06/07/2023)
  Self-supervised contrastive learning (CL) has achieved state-of-the-art ...
