Teacher-Student Training and Triplet Loss to Reduce the Effect of Drastic Face Occlusion

We study a series of recognition tasks in two realistic scenarios requiring the analysis of faces under strong occlusion. On the one hand, we aim to recognize facial expressions of people wearing Virtual Reality (VR) headsets. On the other hand, we aim to estimate the age and identify the gender of people wearing surgical masks. For all these tasks, the common ground is that half of the face is occluded. In this challenging setting, we show that convolutional neural networks (CNNs) trained on fully-visible faces exhibit very low performance levels. While fine-tuning the deep learning models on occluded faces is extremely useful, we show that additional performance gains can be obtained by distilling knowledge from models trained on fully-visible faces. To this end, we study two knowledge distillation methods, one based on teacher-student training and one based on triplet loss. Our main contribution is a novel approach for knowledge distillation based on triplet loss, which generalizes across models and tasks. Furthermore, we consider combining distilled models learned through conventional teacher-student training or through our novel teacher-student training based on triplet loss. We provide empirical evidence showing that, in most cases, both individual and combined knowledge distillation methods bring statistically significant performance improvements. We conduct experiments with three different neural models (VGG-f, VGG-face, ResNet-50) on various tasks (facial expression recognition, gender recognition, age estimation), showing consistent improvements regardless of the model or task.
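The abstract does not spell out the exact formulation of the triplet-based distillation, but the general idea can be sketched as follows: the teacher (trained on fully-visible faces) provides an anchor embedding of the unoccluded face, while the student (trained on occluded faces) provides a positive embedding of the same face under occlusion and a negative embedding of a different face. A triplet loss then pulls the student's occluded-face embedding toward the teacher's anchor and pushes the mismatched embedding away. The function below is a hypothetical, minimal sketch of such a loss; the embeddings, the margin value, and the squared-Euclidean distance are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def triplet_distillation_loss(anchor, positive, negative, margin=0.2):
    """Hypothetical sketch of a triplet loss for knowledge distillation.

    anchor:   teacher embedding of the fully-visible face
    positive: student embedding of the same face, half-occluded
    negative: student embedding of a different face, half-occluded
    margin:   illustrative margin value (assumption, not from the paper)
    """
    # Squared Euclidean distances between the teacher anchor and the
    # student's positive/negative embeddings.
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    # Hinge: penalize only when the positive is not closer to the
    # anchor than the negative by at least the margin.
    return max(d_pos - d_neg + margin, 0.0)

# Toy usage: when the student already matches the teacher on the same
# face and differs on the other face, the loss vanishes.
anchor = np.array([1.0, 0.0])
positive = np.array([1.0, 0.0])
negative = np.array([0.0, 1.0])
print(triplet_distillation_loss(anchor, positive, negative))
```

In practice the loss would be minimized over mini-batches by backpropagating through the student network only, with the teacher's embeddings held fixed.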


