Understanding Fairness of Gender Classification Algorithms Across Gender-Race Groups

09/24/2020
by Anoop Krishnan, et al.

Automated gender classification has important applications in many domains, such as demographic research, law enforcement, online advertising, and human-computer interaction. Recent research has questioned the fairness of this technology across gender and race. Specifically, most studies have raised concerns about higher error rates of face-based gender classification systems for darker-skinned people, such as African-Americans, and for women. However, to date, most existing studies have been limited to African-American and Caucasian subjects only. The aim of this paper is to investigate the differential performance of gender classification algorithms across gender-race groups. To this end, we investigate the impact of (a) architectural differences in the deep learning algorithms and (b) training set imbalance as potential sources of bias causing differential performance across gender and race. Experimental investigations are conducted on two recent large-scale, publicly available facial attribute datasets, namely UTKFace and FairFace. The experimental results suggest that algorithms with architectural differences vary in performance, but are consistent in which gender-race groups they disadvantage. For instance, for all the algorithms used, Black females (and the Black race in general) always obtained the lowest accuracy rates, while Middle Eastern males and Latino females obtained higher accuracy rates most of the time. Training set imbalance further widens the gap in accuracy rates across gender-race groups. Further investigation using facial landmarks suggests that facial morphological differences, due to bone structure influenced by genetic and environmental factors, could explain the lower performance on Black females and the Black race in general.
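At its core, the differential-performance analysis described above reduces to disaggregating classification accuracy by gender-race subgroup and comparing the resulting rates. A minimal sketch of that bookkeeping (the flat record layout and group labels here are illustrative assumptions, not the authors' code or the datasets' actual schema):

```python
from collections import defaultdict

def group_accuracies(records):
    """Compute gender-classification accuracy per (race, gender) group.

    `records` is an iterable of (race, gender, predicted_gender) tuples,
    a hypothetical per-image result format for a labeled face dataset
    such as UTKFace or FairFace.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for race, gender, predicted in records:
        key = (race, gender)
        total[key] += 1
        if predicted == gender:
            correct[key] += 1
    # Accuracy per subgroup; gaps between groups indicate differential performance.
    return {key: correct[key] / total[key] for key in total}

# Toy example with unequal error rates across groups.
results = [
    ("Black", "F", "M"), ("Black", "F", "F"),
    ("Black", "M", "M"), ("Black", "M", "M"),
    ("White", "F", "F"), ("White", "F", "F"),
    ("White", "M", "M"), ("White", "M", "M"),
]
accs = group_accuracies(results)
# Black females: 1 of 2 correct -> 0.5; the other groups -> 1.0
```

In practice the same disaggregation would be applied to a trained model's predictions on the full test split, with the per-group accuracies then compared to quantify the fairness gap.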

