Self-distillation with Online Diffusion on Batch Manifolds Improves Deep Metric Learning

11/14/2022
by Zelong Zeng, et al.

Recent deep metric learning (DML) methods typically leverage class labels alone to keep positive samples far away from negative ones. However, this type of method normally ignores the crucial knowledge hidden in the data (e.g., intra-class variation), which harms the generalization of the trained model. To alleviate this problem, in this paper we propose Online Batch Diffusion-based Self-Distillation (OBD-SD) for DML. Specifically, we first propose a simple but effective Progressive Self-Distillation (PSD) scheme, which progressively distills knowledge from the model itself during training. The soft distance targets produced by PSD encode richer relational information among samples, which benefits the diversity of the embedding representations. We then extend PSD with an Online Batch Diffusion Process (OBDP), which captures the local geometric structure of the manifold within each batch, revealing the intrinsic relationships among samples and producing better soft distance targets. Notably, OBDP compensates for the insufficient manifold relationships captured by the original PSD and yields significant performance improvements. Our OBD-SD is a flexible framework that can be integrated into state-of-the-art (SOTA) DML methods. Extensive experiments on various benchmarks, namely CUB200, CARS196, and Stanford Online Products, demonstrate that our OBD-SD consistently improves the performance of existing DML methods with negligible additional training time, achieving very competitive results. Code: <https://github.com/ZelongZeng/OBD-SD_Pytorch>
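To make the two components concrete, here is a minimal, hypothetical PyTorch sketch of how an online batch diffusion step could turn raw in-batch similarities into soft distance targets for self-distillation. It uses the classical manifold diffusion process of Zhou et al. (2004) on a batch affinity graph; all names (`batch_diffusion`, `soft_target_loss`, `alpha`, the temperatures) and the exact loss form are illustrative assumptions, not taken from the paper or its repository.

```python
import torch
import torch.nn.functional as F


def batch_diffusion(embeddings: torch.Tensor, alpha: float = 0.9) -> torch.Tensor:
    """Diffuse pairwise affinities over the batch graph so they reflect the
    local manifold structure rather than raw pairwise similarity alone.

    Uses the closed-form solution of the standard diffusion process
    (Zhou et al., 2004): F* = (1 - alpha) * (I - alpha * S)^{-1}.
    """
    x = F.normalize(embeddings, dim=1)          # unit-norm embeddings
    a = torch.clamp(x @ x.t(), min=0.0)         # non-negative cosine affinities
    a.fill_diagonal_(0.0)                       # remove self-loops
    d = a.sum(dim=1).clamp(min=1e-12).rsqrt()   # D^{-1/2}
    s = d[:, None] * a * d[None, :]             # symmetric normalization S
    eye = torch.eye(a.size(0), device=a.device)
    return (1.0 - alpha) * torch.linalg.inv(eye - alpha * s)


def soft_target_loss(student_emb: torch.Tensor,
                     teacher_emb: torch.Tensor,
                     tau_s: float = 0.1,
                     tau_t: float = 0.1) -> torch.Tensor:
    """KL divergence between the student's in-batch similarity distribution
    and the diffused teacher affinities used as soft distance targets."""
    with torch.no_grad():
        # Diffused affinities from the (frozen) teacher embeddings.
        target = F.softmax(batch_diffusion(teacher_emb) / tau_t, dim=1)
    x = F.normalize(student_emb, dim=1)
    log_p = F.log_softmax((x @ x.t()) / tau_s, dim=1)
    return F.kl_div(log_p, target, reduction="batchmean")
```

In the self-distillation setting described above, `teacher_emb` would come from an earlier snapshot of the same model being trained, so no separate teacher network is needed; this loss would be added to a standard DML objective rather than replacing it.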

Related research:

- Improving Generalization of Metric Learning via Listwise Self-distillation (06/17/2022): Most deep metric learning (DML) methods employ a strategy that forces al...
- Self-Distillation from the Last Mini-Batch for Consistency Regularization (03/30/2022): Knowledge distillation (KD) shows a bright promise as a powerful regular...
- S2SD: Simultaneous Similarity-based Self-Distillation for Deep Metric Learning (09/17/2020): Deep Metric Learning (DML) provides a crucial tool for visual similarity...
- Non-isotropy Regularization for Proxy-based Deep Metric Learning (03/16/2022): Deep Metric Learning (DML) aims to learn representation spaces on which ...
- Fewer is More: A Deep Graph Metric Learning Perspective Using Fewer Proxies (10/26/2020): Deep metric learning plays a key role in various machine learning tasks....
- Learning Intra-Batch Connections for Deep Metric Learning (02/15/2021): The goal of metric learning is to learn a function that maps samples to ...
- Disjoint Masking with Joint Distillation for Efficient Masked Image Modeling (12/31/2022): Masked image modeling (MIM) has shown great promise for self-supervised ...
