Dimensions of Diversity in Human Perceptions of Algorithmic Fairness
Algorithms are increasingly involved in making decisions that affect human lives. Prior work has explored how people believe algorithmic decisions should be made, but there is little understanding of which individual factors relate to variance in these beliefs across people. As increasing emphasis is placed on oversight boards and regulatory bodies, it is important to understand the biases that may affect human judgements about the fairness of algorithms. Building on factors found in moral foundations theory and the egocentric fairness literature, we explore how people's perceptions of fairness relate to their (i) demographics (age, race, gender, political view) and (ii) personal experiences with the algorithmic task being evaluated. Specifically, we study human beliefs about the fairness of using different features in an algorithm designed to assist judges in making decisions about granting bail. Our analysis suggests that political views and certain demographic factors, such as age and gender, exhibit a significant relation to people's beliefs about fairness. Additionally, we find that people's beliefs about the fairness of using demographic features such as age, gender, and race for making bail decisions about others vary egocentrically: that is, they vary depending on their own age, gender, and race, respectively.