An Intersectional Definition of Fairness

07/22/2018
by James Foulds, et al.

We introduce a measure of fairness for algorithms and data with regard to multiple protected attributes. Our proposed definition, differential fairness, is informed by the framework of intersectionality, which analyzes how interlocking systems of power and oppression affect individuals along overlapping dimensions including race, gender, sexual orientation, class, and disability. We show that our criterion behaves sensibly for any subset of the protected attributes, and we illustrate links to differential privacy. A case study on census data demonstrates the utility of our approach.
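The differential fairness criterion described above bounds, for every pair of intersectional groups, the ratio of the probabilities of each outcome within a factor of e^ε, mirroring the multiplicative guarantee of differential privacy. As a minimal sketch, the empirical ε for a set of binary predictions could be estimated as below; the function name `empirical_epsilon_df` and the unsmoothed empirical probabilities are illustrative assumptions (the paper's case study uses smoothed, model-based estimates), not the authors' reference implementation.

```python
import numpy as np

def empirical_epsilon_df(y_pred, groups):
    """Estimate the smallest epsilon such that, for all pairs of groups
    (s_i, s_j) and both outcomes y in {0, 1},
        e^{-eps} <= P(y | s_i) / P(y | s_j) <= e^{eps}.
    Uses raw empirical frequencies; zero-probability outcomes yield inf,
    so smoothing is needed in practice.
    """
    y_pred = np.asarray(y_pred)
    groups = np.asarray(groups)
    eps = 0.0
    labels = np.unique(groups)
    for a in labels:
        for b in labels:
            if a == b:
                continue
            for y in (0, 1):
                # empirical probability of outcome y within each group
                p_a = (y_pred[groups == a] == y).mean()
                p_b = (y_pred[groups == b] == y).mean()
                # log-ratio of the two probabilities bounds this pair
                eps = max(eps, abs(np.log(p_a) - np.log(p_b)))
    return eps
```

For example, if group `a` receives positive predictions at rate 0.5 and group `b` at rate 0.25, the estimated ε is ln 2 ≈ 0.693, driven by the y = 1 outcome.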

