Sex and Gender in the Computer Graphics Research Literature

06/01/2022
by Ana Dodik, et al.

We survey the treatment of sex and gender in the Computer Graphics research literature from an algorithmic fairness perspective. The established practices for the use of gender and sex in our community are scientifically incorrect and constitute a form of algorithmic bias with potentially harmful effects. We propose ways of addressing these as technical limitations.


Related research

03/28/2021  Countering Racial Bias in Computer Graphics Research
12/16/2021  Gendered Language in Resumes and its Implications for Algorithmic Bias in Hiring
02/15/2022  Choosing an algorithmic fairness metric for an online marketplace: Detecting and quantifying algorithmic bias on LinkedIn
03/31/2021  Mitigating Bias in Algorithmic Systems: A Fish-Eye View of Problems and Solutions Across Domains
12/25/2017  Gender differences in beliefs about algorithmic fairness
12/25/2017  Demographics and discussion influence views on algorithmic fairness
04/17/2020  Wide range screening of algorithmic bias in word embedding models using large sentiment lexicons reveals underreported bias types
