Gender Stereotyping Impact in Facial Expression Recognition

10/11/2022
by Iris Dominguez-Catena, et al.

Facial Expression Recognition (FER) uses images of faces to identify the emotional state of users, allowing for closer interaction between humans and autonomous systems. Unfortunately, because the images naturally carry demographic information, such as the apparent age, gender, and race of the subject, these systems are prone to demographic bias. In recent years, machine learning-based models have become the most popular approach to FER. These models require training on large datasets of facial expression images, and their generalization capabilities are strongly tied to the characteristics of those datasets. In publicly available FER datasets, apparent gender representation is usually roughly balanced overall, but its representation within individual emotion labels is not, embedding social stereotypes into the datasets and creating a potential for harm. Although this type of bias has been overlooked so far, it is important to understand the impact it may have in the context of FER. To do so, we use a popular FER dataset, FER+, to generate derivative datasets with different amounts of stereotypical bias by altering the gender proportions of certain labels. We then measure the discrepancy in performance between the apparent gender groups for models trained on these datasets. We observe a discrepancy in the recognition of certain emotions between genders of up to 29% under the worst bias conditions. Our results also suggest a safety range of stereotypical bias in a dataset that does not appear to produce stereotypical bias in the resulting model. Our findings support the need for a thorough bias analysis of public datasets in problems like FER, where a global balance of demographic representation can still hide other types of bias that harm certain demographic groups.
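The experimental setup described above — injecting a controlled amount of stereotypical bias into one emotion label, then measuring the per-gender recognition gap — can be sketched as follows. This is a minimal illustration, not the authors' code: `induce_label_bias` and `recognition_gap` are hypothetical helpers, and the dict-based sample format stands in for the actual FER+ image records.

```python
import random


def induce_label_bias(samples, label, target_female_ratio, seed=0):
    """Subsample so that, within `label`, the apparent-gender split
    approximates `target_female_ratio` (hypothetical helper)."""
    rng = random.Random(seed)
    others = [s for s in samples if s["label"] != label]
    fem = [s for s in samples if s["label"] == label and s["gender"] == "F"]
    mal = [s for s in samples if s["label"] == label and s["gender"] == "M"]
    r = target_female_ratio
    if r in (0.0, 1.0):
        kept = fem if r == 1.0 else mal
    else:
        # Pick n_f, n_m so that n_f / (n_f + n_m) ~ r, keeping as much data
        # as possible: n_m = n_f * (1 - r) / r.
        n_f = min(len(fem), int(len(mal) * r / (1 - r)))
        n_m = int(n_f * (1 - r) / r)
        kept = rng.sample(fem, n_f) + rng.sample(mal, n_m)
    return others + kept


def recognition_gap(predictions, label):
    """Absolute per-gender recall gap for one emotion label, i.e. the
    discrepancy the abstract reports (up to 29% in the worst case)."""
    def recall(gender):
        rel = [p for p in predictions
               if p["label"] == label and p["gender"] == gender]
        if not rel:
            return 0.0
        return sum(p["pred"] == label for p in rel) / len(rel)
    return abs(recall("F") - recall("M"))
```

In the full experiment a model would be trained on each biased derivative dataset and `recognition_gap` evaluated on a held-out, balanced test set; here the two functions only make the dataset manipulation and the bias metric concrete.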


