Assessing Gender Bias in Predictive Algorithms using eXplainable AI

03/19/2022
by Cristina Manresa-Yee, et al.

Predictive algorithms have powerful potential to offer benefits in areas as varied as medicine and education. However, these algorithms and the data they use are built by humans, and consequently they can inherit the biases and prejudices present in humans. Their outcomes can then systematically repeat errors that produce unfair results, and can even lead to discrimination (e.g., on gender, social, or racial grounds). To illustrate how important it is to have a diverse training dataset to avoid bias, we manipulate a well-known facial expression recognition dataset to explore gender bias and discuss its implications.
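The core manipulation the abstract describes, skewing the gender composition of the training set to see how bias emerges, can be sketched in a few lines. Below is a minimal, hypothetical Python illustration rather than the authors' actual pipeline: the metadata layout, column names, and skew ratios are all assumptions for demonstration.

```python
# Minimal sketch: build gender-skewed training splits from a FER-style
# dataset. All names (columns, ratios) are illustrative assumptions,
# not the paper's exact setup; synthetic metadata stands in for real images.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical per-image metadata with gender and emotion labels.
meta = pd.DataFrame({
    "image_id": np.arange(10_000),
    "gender": rng.choice(["female", "male"], size=10_000),
    "emotion": rng.choice(["happy", "sad", "angry", "neutral"], size=10_000),
})

def skewed_split(meta: pd.DataFrame, female_ratio: float, n: int) -> pd.DataFrame:
    """Sample a training set of size n with a controlled gender ratio."""
    n_female = int(n * female_ratio)
    females = meta[meta.gender == "female"].sample(n_female, random_state=0)
    males = meta[meta.gender == "male"].sample(n - n_female, random_state=0)
    return pd.concat([females, males]).reset_index(drop=True)

# One model would be trained per skew level (training omitted here), then
# evaluated on a gender-balanced test set; per-gender accuracy gaps and
# XAI attributions (e.g. saliency maps) reveal what each model latched onto.
for ratio in (0.1, 0.5, 0.9):
    train = skewed_split(meta, female_ratio=ratio, n=4_000)
    print(f"female_ratio={ratio}:", train.gender.value_counts().to_dict())
```

Comparing explanations (for example, pixel attributions) across models trained on these splits is one way to make inherited bias visible, in line with the eXplainable AI framing of the title.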


Related research

- Responsible AI: Gender Bias Assessment in Emotion Recognition (03/21/2021)
- Gender Stereotyping Impact in Facial Expression Recognition (10/11/2022)
- Gendered Language in Resumes and its Implications for Algorithmic Bias in Hiring (12/16/2021)
- Assessing Demographic Bias Transfer from Dataset to Model: A Case Study in Facial Expression Recognition (05/20/2022)
- Mitigating Gender Bias in Machine Learning Data Sets (05/14/2020)
- Examining the Presence of Gender Bias in Customer Reviews Using Word Embedding (02/01/2019)
- Introducing Construct Theory as a Standard Methodology for Inclusive AI Models (04/19/2023)
