Global-Local Self-Distillation for Visual Representation Learning

07/29/2022
by Tim Lebailly, et al.

The downstream accuracy of self-supervised methods is tightly linked to the proxy task solved during training and the quality of the gradients extracted from it. Richer and more meaningful gradient updates are key to allowing self-supervised methods to learn better and more efficiently. In a typical self-distillation framework, the representations of two augmented views of an image are enforced to be coherent at the global level. Nonetheless, incorporating local cues into the proxy task can be beneficial and improve model accuracy on downstream tasks. This leads to a dual objective: on the one hand, coherence between global representations is enforced, and on the other, coherence between local representations is enforced. Unfortunately, an exact correspondence mapping between two sets of local representations does not exist, making the task of matching local representations from one augmentation to the other non-trivial. We propose to leverage the spatial information in the input images to obtain geometric matchings, and compare this geometric approach against previous methods based on similarity matchings. Our study shows that 1) geometric matchings perform better than similarity-based matchings in low-data regimes, and 2) similarity-based matchings are highly detrimental in low-data regimes compared to the vanilla baseline without local self-distillation. The code will be released upon acceptance.
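To make the distinction between the two matching strategies concrete, here is a minimal PyTorch sketch contrasting a geometric matching, derived from the crop coordinates of the two augmented views, with a similarity matching, derived from the token features themselves. The function names, the 7x7 token grid, the crop-box format, and the cosine local loss are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
# Minimal sketch (assumed names/shapes) of geometric vs. similarity matching
# of local token representations between two augmented views of one image.
import torch
import torch.nn.functional as F

def patch_centers(box, grid=7):
    """Centers of a `grid` x `grid` patch grid of a crop, expressed in
    normalized coordinates of the original image. `box` = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = box
    xs = torch.linspace(x0, x1, grid + 1)[:-1] + (x1 - x0) / (2 * grid)
    ys = torch.linspace(y0, y1, grid + 1)[:-1] + (y1 - y0) / (2 * grid)
    gy, gx = torch.meshgrid(ys, xs, indexing="ij")
    return torch.stack([gx, gy], dim=-1).reshape(-1, 2)  # (grid^2, 2)

def match_geometric(box_a, box_b, grid=7):
    """For each patch of view A, the index of the spatially closest
    patch of view B, using only crop geometry (no features)."""
    ca, cb = patch_centers(box_a, grid), patch_centers(box_b, grid)
    return torch.cdist(ca, cb).argmin(dim=1)  # (grid^2,)

def match_similarity(tok_a, tok_b):
    """For each token of view A, the index of the most similar token
    of view B under cosine similarity (no spatial information)."""
    sim = F.normalize(tok_a, dim=-1) @ F.normalize(tok_b, dim=-1).T
    return sim.argmax(dim=1)  # (num_tokens,)

# Toy usage: two overlapping crops of the same image, random local tokens.
box_a, box_b = (0.0, 0.0, 0.6, 0.6), (0.3, 0.3, 1.0, 1.0)
tok_a, tok_b = torch.randn(49, 256), torch.randn(49, 256)

idx_geo = match_geometric(box_a, box_b)
idx_sim = match_similarity(tok_a, tok_b)

# Local self-distillation term: pull each token of view A toward its
# matched token of view B (a plain cosine loss here, for illustration).
loss_local = 1 - F.cosine_similarity(tok_a, tok_b[idx_geo], dim=-1).mean()
print(loss_local)
```

Note that the geometric matching depends only on where the two crops land in the original image, so it stays reliable even when the features are poor, e.g. early in training or in low-data regimes; the similarity matching can latch onto spurious feature neighbors in exactly those settings, which is consistent with the abstract's second finding. A faithful implementation would also restrict matches to patches inside the overlap of the two crops.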
