Teaching Where to Look: Attention Similarity Knowledge Distillation for Low Resolution Face Recognition

09/29/2022
by Sungho Shin, et al.

Deep learning has achieved outstanding performance on face recognition benchmarks, but performance degrades significantly for low resolution (LR) images. We propose an attention similarity knowledge distillation approach that transfers attention maps from a high resolution (HR) teacher network to an LR student network to boost LR recognition performance. Inspired by the human ability to approximate an object's region in an LR image using prior knowledge acquired from HR images, we design the knowledge distillation loss with cosine similarity so that the student network's attention resembles the teacher network's attention. Experiments on various LR face-related benchmarks confirm that the proposed method generally improves recognition performance in LR settings, outperforming state-of-the-art results by simply transferring well-constructed attention maps. The code and pretrained models are publicly available at https://github.com/gist-ailab/teaching-where-to-look.
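The central ingredient described in the abstract is a distillation loss that measures cosine similarity between the teacher's and student's attention maps. The snippet below is a minimal sketch of such a loss, not the authors' implementation (see the linked repository for that); the function name attention_distillation_loss, the list-of-tensors interface, and the (B, C, H, W) attention-map shape are assumptions made for illustration.

```python
# Minimal sketch of a cosine-similarity attention distillation loss.
# Assumed setup: the HR teacher and LR student each expose a list of
# attention maps of shape (B, C, H, W), one per distillation point.
import torch
import torch.nn.functional as F

def attention_distillation_loss(student_attn, teacher_attn, eps=1e-8):
    """Push the LR student's attention maps toward the HR teacher's.

    student_attn, teacher_attn: lists of tensors of shape (B, C, H, W).
    Returns the mean (1 - cosine similarity) over distillation points.
    """
    loss = 0.0
    for s, t in zip(student_attn, teacher_attn):
        # Flatten each attention map to one vector per sample.
        s_flat = s.flatten(start_dim=1)
        t_flat = t.flatten(start_dim=1).detach()  # teacher is not updated
        # Cosine similarity per sample, averaged over the batch.
        cos = F.cosine_similarity(s_flat, t_flat, dim=1, eps=eps)
        loss = loss + (1.0 - cos).mean()
    return loss / len(student_attn)
```

In training, a loss like this would typically be added to the usual face recognition objective (e.g., a margin-based softmax loss) so the student learns both identity discrimination and where to look.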


Related research

03/08/2023
Enhancing Low-resolution Face Recognition with Feature Similarity Knowledge Distillation
In this study, we introduce a feature knowledge distillation framework t...

11/25/2018
Low-resolution Face Recognition in the Wild via Selective Knowledge Distillation
Typically, the deployment of face recognition models in the wild needs t...

08/18/2023
CCFace: Classification Consistency for Low-Resolution Face Recognition
In recent years, deep face recognition methods have demonstrated impress...

06/03/2019
Deep Face Recognition Model Compression via Knowledge Transfer and Distillation
Fully convolutional networks (FCNs) have become de facto tool to achieve...

07/14/2022
Dynamic Low-Resolution Distillation for Cost-Efficient End-to-End Text Spotting
End-to-end text spotting has attached great attention recently due to it...

04/09/2019
Ultrafast Video Attention Prediction with Coupled Knowledge Distillation
Large convolutional neural network models have recently demonstrated imp...

04/10/2023
Grouped Knowledge Distillation for Deep Face Recognition
Compared with the feature-based distillation methods, logits distillatio...
