Distill and De-bias: Mitigating Bias in Face Recognition using Knowledge Distillation

12/17/2021
by Prithviraj Dhar, et al.

Face recognition networks generally demonstrate bias with respect to sensitive attributes such as gender and skintone. For gender and skintone, we observe that the face regions a network attends to vary with the attribute category, which may contribute to this bias. Building on this intuition, we propose a novel distillation-based approach, Distill and De-bias (D&D), that forces a network to attend to similar face regions irrespective of the attribute category. In D&D, we first train a teacher network on images from one category of an attribute, e.g., light skintone. Then, distilling information from this teacher, we train a student network on images of the remaining category, e.g., dark skintone. A feature-level distillation loss constrains the student network to generate teacher-like representations, so the student attends to similar face regions across attribute categories and thereby reduces bias. We also propose a second distillation step on top of D&D, called D&D++, in which we distill the 'un-biasedness' of the D&D network into a new student network trained on all attribute categories, e.g., both light and dark skintones. This yields a network that is less biased with respect to the attribute while achieving higher face verification performance than D&D. We show that D&D++ outperforms existing baselines in reducing gender and skintone bias on the IJB-C dataset, while obtaining higher face verification performance than existing adversarial de-biasing methods. We evaluate the effectiveness of our proposed methods on two state-of-the-art face recognition networks: Crystalface and ArcFace.
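The feature-level distillation described in the abstract admits a compact sketch. The snippet below is only an illustration of the general idea: the cosine-based distillation term, the weight lambda_kd, and the recognition_loss_fn placeholder are assumptions for clarity, not the authors' exact formulation, and the teacher/student networks are taken as given face embedding models.

```python
import torch
import torch.nn.functional as F


def feature_distillation_loss(student_feats, teacher_feats):
    # Pull the student's embeddings toward the frozen teacher's embeddings
    # for the same batch of images. A cosine form is assumed here; the
    # paper's exact feature-level loss may differ (e.g. an L2 penalty).
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)
    return (1.0 - (s * t).sum(dim=1)).mean()


def dnd_training_step(student, teacher, images, labels,
                      recognition_loss_fn, lambda_kd=1.0):
    # One illustrative D&D-style step: the teacher was pre-trained on one
    # attribute category (e.g. light skintone) and stays frozen, while the
    # student sees images of the remaining category (e.g. dark skintone)
    # and is constrained to produce teacher-like representations.
    with torch.no_grad():
        teacher_feats = teacher(images)

    student_feats = student(images)

    loss_rec = recognition_loss_fn(student_feats, labels)  # e.g. a margin-softmax loss
    loss_kd = feature_distillation_loss(student_feats, teacher_feats)
    return loss_rec + lambda_kd * loss_kd
```

Under the same assumptions, the D&D++ stage could reuse this step with the trained D&D network serving as the frozen teacher and the new student trained on images from all attribute categories.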


