Fairness of Exposure in Rankings

02/20/2018
by Ashudeep Singh, et al.

Rankings are ubiquitous in the online world today. As we have transitioned from finding books in libraries to ranking products, jobs, job applicants, opinions, and potential romantic partners, there is a substantial precedent that ranking systems have a responsibility not only to their users but also to the items being ranked. To address these often conflicting responsibilities, we propose a conceptual and computational framework that allows the formulation of fairness constraints on rankings in terms of how exposure is allocated to the items. As part of this framework, we develop efficient algorithms for finding rankings that maximize the utility for the user while satisfying fairness constraints for the items. Since fairness goals can be application specific, we show how a broad range of fairness constraints can be implemented in our framework, including forms of demographic parity, disparate treatment, and disparate impact constraints. We illustrate the effect of these constraints by providing empirical results on two ranking problems.
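The computational side of the framework represents a probabilistic ranking as a doubly stochastic matrix P, where P[i, j] is the probability of placing item i at rank j, and finds the utility-maximizing P under a fairness constraint via linear programming; concrete rankings are then sampled from P through a Birkhoff-von Neumann decomposition. The sketch below is not the authors' code: it is a minimal illustration of the demographic-parity-of-exposure variant using scipy.optimize.linprog, with the relevances u, position-bias weights v, and group labels invented for the example.

# Minimal sketch (not the authors' code): utility-maximizing probabilistic
# ranking under a demographic-parity-of-exposure constraint, solved as an LP.
# u (relevances), v (position bias), and the group split are hypothetical.
import numpy as np
from scipy.optimize import linprog

n = 6
u = np.array([0.81, 0.80, 0.79, 0.78, 0.77, 0.76])   # hypothetical relevances
v = 1.0 / np.log2(np.arange(2, n + 2))               # DCG-style position bias
group = np.array([0, 0, 0, 1, 1, 1])                 # hypothetical group labels

# Objective: maximize sum_ij u_i * v_j * P_ij  ->  minimize the negative.
c = -(np.outer(u, v)).ravel()

# Equality constraints: every row and column of P sums to 1 (doubly stochastic),
# plus one parity row forcing equal average exposure for the two groups.
A_eq, b_eq = [], []
for i in range(n):                                   # row sums
    row = np.zeros((n, n)); row[i, :] = 1.0
    A_eq.append(row.ravel()); b_eq.append(1.0)
for j in range(n):                                   # column sums
    col = np.zeros((n, n)); col[:, j] = 1.0
    A_eq.append(col.ravel()); b_eq.append(1.0)
parity = np.zeros((n, n))
for i in range(n):                                   # exposure of item i is sum_j v_j * P_ij
    w = 1.0 / (group == group[i]).sum()
    parity[i, :] = (w if group[i] == 0 else -w) * v
A_eq.append(parity.ravel()); b_eq.append(0.0)

res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=[(0.0, 1.0)] * (n * n), method="highs")
P = res.x.reshape(n, n)                              # the paper samples rankings from P
                                                     # via Birkhoff-von Neumann (not shown)
print("expected utility:", -res.fun)
for g in (0, 1):
    print(f"avg. exposure, group {g}:", (P[group == g] @ v).mean())

The disparate-treatment and disparate-impact variants would, roughly, replace the single parity row with constraints tying each group's exposure or expected click-through to its average relevance; with fixed relevance estimates these remain linear in P.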


Related research

Fairness for Robust Learning to Rank (12/12/2021)
While conventional ranking systems focus solely on maximizing the utilit...

Equity of Attention: Amortizing Individual Fairness in Rankings (05/04/2018)
Rankings of people and items are at the heart of selection-making, match...

A Nutritional Label for Rankings (04/21/2018)
Algorithmic decisions often result in scoring and ranking individuals to...

Policy Learning for Fairness in Ranking (02/11/2019)
Conventional Learning-to-Rank (LTR) methods optimize the utility of the ...

(Un)fair Exposure in Deep Face Rankings at a Distance (08/22/2023)
Law enforcement regularly faces the challenge of ranking suspects from t...

Greedy Optimized Multileaving for Personalization (07/19/2019)
Personalization plays an important role in many services. To evaluate pe...
