DarkRank: Accelerating Deep Metric Learning via Cross Sample Similarities Transfer

07/05/2017
by   Yuntao Chen, et al.

We have witnessed a rapid evolution of deep neural network architecture design in recent years. These advances have greatly facilitated progress in areas such as computer vision and natural language processing. However, along with their extraordinary performance, these state-of-the-art models also incur expensive computational cost, and directly deploying them in applications with real-time requirements remains infeasible. Recently, Hinton et al. showed that the dark knowledge within a powerful teacher model can significantly help the training of a smaller and faster student network; such knowledge is vastly beneficial for improving the generalization ability of the student model. Inspired by their work, we introduce a new type of knowledge for model compression and acceleration: cross sample similarities. This knowledge can be naturally derived from deep metric learning models. To transfer it, we bring the learning to rank technique into the deep metric learning formulation. We test our proposed DarkRank on the pedestrian re-identification task. The results are quite encouraging: DarkRank improves over the baseline method by a large margin. Moreover, it is fully compatible with other existing methods, and combining them boosts performance further.
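To make the idea of transferring cross sample similarities via learning to rank concrete, the sketch below shows one plausible listwise formulation: candidates are scored by their similarity to an anchor in both the teacher's and the student's embedding space, the scores are turned into top-one ranking probabilities (ListNet style), and the student is penalized for diverging from the teacher's distribution. The helper names and the choice of negative squared Euclidean distance as the similarity score are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def listwise_rank_distill_loss(anchor_t, cands_t, anchor_s, cands_s):
    """Sketch of a listwise cross-sample-similarity transfer loss.

    Each candidate is scored by negative squared Euclidean distance to
    the anchor, scores become top-one ranking probabilities, and the
    loss is the KL divergence from the teacher's distribution to the
    student's. Hypothetical helper, not the authors' exact loss.
    """
    score_t = -np.sum((cands_t - anchor_t) ** 2, axis=1)  # teacher similarities
    score_s = -np.sum((cands_s - anchor_s) ** 2, axis=1)  # student similarities
    p_t, p_s = softmax(score_t), softmax(score_s)
    return float(np.sum(p_t * (np.log(p_t) - np.log(p_s))))

# Toy example: one anchor and three candidates in a 4-D embedding space.
rng = np.random.default_rng(0)
anchor_t, cands_t = rng.normal(size=4), rng.normal(size=(3, 4))
anchor_s, cands_s = rng.normal(size=4), rng.normal(size=(3, 4))
loss = listwise_rank_distill_loss(anchor_t, cands_t, anchor_s, cands_s)
```

The loss is zero exactly when the student reproduces the teacher's similarity distribution, which is what lets the ranking of neighbors, rather than individual logits, carry the transferred knowledge.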


