Image-to-Video Re-Identification via Mutual Discriminative Knowledge Transfer

01/21/2022
by Pichao Wang, et al.

The gap between image and video representations makes Image-to-Video Re-identification (I2V Re-ID) challenging, and recent works formulate this problem as a knowledge distillation (KD) process. In this paper, we propose a mutual discriminative knowledge distillation framework to transfer the richer video-based representation to an image-based representation more effectively. Specifically, we propose the triplet contrast loss (TCL), a novel loss designed for KD. During the KD process, TCL transfers the local structure, exploits higher-order information, and mitigates the misalignment between the heterogeneous outputs of the teacher and student networks. Compared with other KD losses, the proposed TCL selectively transfers local discriminative features from teacher to student, making it effective for Re-ID. Besides TCL, we adopt mutual learning to regularize the training of both the teacher and student networks. Extensive experiments demonstrate the effectiveness of our method on the MARS, DukeMTMC-VideoReID, and VeRi-776 benchmarks.
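The abstract does not give the exact formulation of TCL or of the mutual-learning objective. As a rough illustration only, here is a minimal PyTorch sketch of the general idea: a triplet-style contrastive distillation term that pulls each student (image) embedding toward its matching teacher (video) embedding and away from the hardest non-matching one in the batch, applied in both directions so that teacher and student regularize each other. All names here (triplet_contrast_kd, the margin value, the toy linear encoders) are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def triplet_contrast_kd(student_emb, teacher_emb, margin=0.3):
    # Hypothetical triplet-style contrastive KD term: for each sample,
    # the positive is its own embedding from the other network, the
    # negative is the hardest non-matching embedding in the batch.
    s = F.normalize(student_emb, dim=1)
    t = F.normalize(teacher_emb, dim=1)
    sim = s @ t.t()                       # (B, B) cross-network cosine similarities
    pos = sim.diag()                      # student_i vs. teacher_i (matching pairs)
    eye = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
    hardest_neg = sim.masked_fill(eye, float('-inf')).max(dim=1).values
    return F.relu(hardest_neg - pos + margin).mean()

# Toy stand-ins for the two branches: an image-based student and a
# video-based teacher, each mapping pooled features to an embedding.
student = nn.Linear(512, 128)
teacher = nn.Linear(512, 128)
opt_s = torch.optim.SGD(student.parameters(), lr=0.01)
opt_t = torch.optim.SGD(teacher.parameters(), lr=0.01)

img_feat = torch.randn(8, 512)            # image-level features (student input)
vid_feat = torch.randn(8, 512)            # clip-level features (teacher input)
s_emb, t_emb = student(img_feat), teacher(vid_feat)

# Mutual learning: each network is pulled toward the other's (detached)
# embeddings, so the teacher is regularized by the student as well.
loss_s = triplet_contrast_kd(s_emb, t_emb.detach())
loss_t = triplet_contrast_kd(t_emb, s_emb.detach())
opt_s.zero_grad(); loss_s.backward(); opt_s.step()
opt_t.zero_grad(); loss_t.backward(); opt_t.step()

Detaching the partner's embeddings in each term keeps every update driven only by that network's own parameters; this is one common way to realize mutual learning, and the paper's actual regularization may differ.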
