ProxylessKD: Direct Knowledge Distillation with Inherited Classifier for Face Recognition

10/31/2020
by Weidong Shi, et al.

Knowledge Distillation (KD) refers to transferring knowledge from a large model to a smaller one and is widely used to enhance model performance in machine learning. It aims to align the embedding spaces produced by the teacher and the student model (i.e., to make images with the same semantics share the same embedding across the two models). In this work, we focus on its application to face recognition. We observe that existing knowledge distillation methods optimize proxy tasks that force the student to mimic the teacher's behavior, instead of directly optimizing face recognition accuracy. Consequently, the resulting student models are not guaranteed to be optimal on the target task, nor can they benefit from advanced constraints such as large-margin constraints (e.g., margin-based softmax). We therefore propose a novel method named ProxylessKD that directly optimizes face recognition accuracy by inheriting the teacher's classifier as the student's classifier, guiding the student to learn discriminative embeddings in the teacher's embedding space. The proposed ProxylessKD is easy to implement and sufficiently generic to extend to tasks beyond face recognition. We conduct extensive experiments on standard face recognition benchmarks, and the results demonstrate that ProxylessKD achieves superior performance over existing knowledge distillation methods.
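
The idea of inheriting the teacher's classifier can be sketched in a few lines of PyTorch. The sketch below is a minimal illustration, not the authors' code: the wrapper name InheritedClassifierKD and the ArcFace-style scale/margin values are assumptions. It shows the abstract's core mechanism, i.e. training the student backbone against the teacher's frozen classifier weights with a margin-based softmax, so the student learns embeddings directly in the teacher's embedding space instead of through a mimicry proxy.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InheritedClassifierKD(nn.Module):
    """Sketch of ProxylessKD's core idea: the student backbone is trained
    against the teacher's (frozen) classifier weights, so it learns
    discriminative embeddings in the teacher's embedding space.
    Names and hyperparameters here are illustrative assumptions."""

    def __init__(self, student_backbone: nn.Module,
                 teacher_fc_weight: torch.Tensor,
                 scale: float = 64.0, margin: float = 0.5):
        super().__init__()
        self.backbone = student_backbone  # maps images -> embeddings (B, D)
        # Inherit the teacher's classifier weights and freeze them.
        self.weight = nn.Parameter(teacher_fc_weight.clone(),
                                   requires_grad=False)
        self.scale, self.margin = scale, margin  # ArcFace-style settings

    def forward(self, images: torch.Tensor,
                labels: torch.Tensor) -> torch.Tensor:
        emb = F.normalize(self.backbone(images))   # unit-norm embeddings
        w = F.normalize(self.weight)               # unit-norm class centers
        cos = emb @ w.t()                          # cosine similarities (B, C)
        # Additive angular margin applied to the target class only.
        theta = torch.acos(cos.clamp(-1 + 1e-7, 1 - 1e-7))
        target = F.one_hot(labels, num_classes=w.size(0)).bool()
        logits = torch.where(target, torch.cos(theta + self.margin), cos)
        return F.cross_entropy(self.scale * logits, labels)
```

In practice, teacher_fc_weight would be taken from the teacher's final classification layer (e.g. teacher_model.fc.weight.detach(), with the exact attribute depending on the teacher's architecture), and only the student backbone's parameters are optimized.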

Related research

- CoupleFace: Relation Matters for Face Recognition Distillation (04/12/2022)
  Knowledge distillation is an effective method to improve the performance...
- Evaluation-oriented Knowledge Distillation for Deep Face Recognition (06/06/2022)
  Knowledge distillation (KD) is a widely-used technique that utilizes lar...
- Cross Architecture Distillation for Face Recognition (06/26/2023)
  Transformers have emerged as the superior choice for face recognition ta...
- Asymmetric Face Recognition with Cross Model Compatible Ensembles (03/30/2023)
  The asymmetrical retrieval setting is a well suited solution for resourc...
- MarginDistillation: distillation for margin-based softmax (03/05/2020)
  The usage of convolutional neural networks (CNNs) in conjunction with a ...
- Distribution Distillation Loss: Generic Approach for Improving Face Recognition from Hard Samples (02/10/2020)
  Large facial variations are the main challenge in face recognition. To t...
- Grouped Knowledge Distillation for Deep Face Recognition (04/10/2023)
  Compared with the feature-based distillation methods, logits distillatio...
