CoupleFace: Relation Matters for Face Recognition Distillation

04/12/2022
by Jiaheng Liu, et al.

Knowledge distillation is an effective method for improving the performance of a lightweight neural network (i.e., the student model) by transferring the knowledge of a well-performing neural network (i.e., the teacher model), and it has been widely applied in many computer vision tasks, including face recognition. Nevertheless, current face recognition distillation methods usually apply Feature Consistency Distillation (FCD) (e.g., an L2 distance) to the embeddings extracted by the teacher and student models for each sample, which cannot fully transfer the knowledge from the teacher to the student for face recognition. In this work, we observe that mutual relation knowledge between samples is also important for improving the discriminative ability of the learned representation of the student model, and we propose an effective face recognition distillation method called CoupleFace, which additionally introduces Mutual Relation Distillation (MRD) into the existing distillation framework. Specifically, in MRD, we first mine the informative mutual relations and then introduce the Relation-Aware Distillation (RAD) loss to transfer the mutual relation knowledge of the teacher model to the student model. Extensive experimental results on multiple benchmark datasets demonstrate the effectiveness of our proposed CoupleFace for face recognition. Moreover, based on CoupleFace, we won first place in the ICCV21 Masked Face Recognition Challenge (MS1M track).
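To make the abstract's two components concrete, below is a minimal PyTorch sketch (PyTorch is an assumption; the paper's implementation is not reproduced here). The names fcd_loss, rad_loss, and distill_loss, the top-k mining heuristic for "informative" relations, and the hinge form of the relation penalty are illustrative choices, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def fcd_loss(student_emb, teacher_emb):
    # Feature Consistency Distillation: per-sample L2 distance between
    # the (L2-normalized) student and teacher embeddings.
    s = F.normalize(student_emb, dim=1)
    t = F.normalize(teacher_emb, dim=1)
    return ((s - t) ** 2).sum(dim=1).mean()

def rad_loss(student_emb, teacher_emb, top_k=8):
    # Simplified Mutual Relation Distillation: align the student's pairwise
    # cosine similarities with the teacher's on a mined subset of
    # "informative" pairs (here: the top-k most similar non-self pairs per
    # anchor under the teacher; assumes batch size > top_k).
    s = F.normalize(student_emb, dim=1)
    t = F.normalize(teacher_emb, dim=1)
    sim_s = s @ s.t()  # student pairwise cosine similarities
    sim_t = t @ t.t()  # teacher pairwise cosine similarities
    n = sim_t.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=sim_t.device)
    _, idx = sim_t.masked_fill(self_mask, float('-inf')).topk(top_k, dim=1)
    # Hinge penalty when the student's similarity falls below the teacher's;
    # the hinge form is an illustrative choice, not the paper's exact loss.
    gap = sim_t.gather(1, idx) - sim_s.gather(1, idx)
    return F.relu(gap).mean()

def distill_loss(student_emb, teacher_emb, lam=1.0):
    # Combined objective: FCD plus the relation-aware term (lam is illustrative).
    return fcd_loss(student_emb, teacher_emb) + lam * rad_loss(student_emb, teacher_emb)

if __name__ == "__main__":
    student = torch.randn(32, 512)       # stand-in student embeddings
    with torch.no_grad():
        teacher = torch.randn(32, 512)   # stand-in frozen teacher embeddings
    print(distill_loss(student, teacher).item())
```

The key design point the sketch tries to capture is that FCD constrains each sample's embedding individually, while the relation term constrains pairwise similarities across samples, so the student also inherits the teacher's relative geometry rather than just per-sample positions.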


Related research

10/31/2020 · ProxylessKD: Direct Knowledge Distillation with Inherited Classifier for Face Recognition
03/29/2021 · Complementary Relation Contrastive Distillation
09/09/2017 · Model Distillation with Knowledge Transfer from Face Classification to Alignment and Verification
04/10/2023 · Grouped Knowledge Distillation for Deep Face Recognition
06/03/2019 · Deep Face Recognition Model Compression via Knowledge Transfer and Distillation
03/05/2020 · MarginDistillation: distillation for margin-based softmax
10/11/2019 · VarGFaceNet: An Efficient Variable Group Convolutional Neural Network for Lightweight Face Recognition
