Fairness of Exposure in Stochastic Bandits

03/03/2021
by Lequn Wang, et al.

Contextual bandit algorithms have become widely used for recommendation in online systems (e.g., marketplaces, music streaming, news), where they now wield substantial influence over which items are exposed to users. This raises questions of fairness to the items, and to the sellers, artists, and writers who benefit from this exposure. We argue that the conventional bandit formulation can lead to an undesirable and unfair winner-takes-all allocation of exposure. To remedy this problem, we propose a new bandit objective that guarantees merit-based fairness of exposure to the items while optimizing utility to the users. We formulate fairness regret and reward regret in this setting, and present algorithms for both stochastic multi-armed bandits and stochastic linear bandits. We prove that these algorithms achieve sub-linear fairness regret and reward regret. Beyond the theoretical analysis, we also provide empirical evidence that the algorithms effectively allocate exposure to the arms in a fair manner.
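For intuition, the "merit-based fairness of exposure" described in the abstract can be read as a target policy that exposes each arm in proportion to a positive merit function of its mean reward, pi*(a) proportional to f(mu_a), with fairness regret measuring how far the played policy drifts from that target. The following Python sketch illustrates this idea under simplifying assumptions (Bernoulli rewards, an additive merit function, and a plug-in UCB-style estimate of the means); it is an illustrative heuristic, not the paper's exact algorithm, and the function names are invented for this example.

```python
import numpy as np

# Illustrative sketch (not the paper's exact algorithm): play arms in
# proportion to the merit of optimistic (UCB-style) mean estimates, so that
# exposure tracks a merit-based fair policy pi*(a) = f(mu_a) / sum_a' f(mu_a').

def merit(x, c=0.5):
    # Hypothetical merit function: positive and increasing in the mean reward.
    return x + c

def fair_policy(means):
    # Exposure proportional to merit.
    m = merit(means)
    return m / m.sum()

def run_fair_bandit(true_means, horizon=10_000, seed=0):
    rng = np.random.default_rng(seed)
    k = len(true_means)
    counts = np.zeros(k)
    sums = np.zeros(k)
    pi_star = fair_policy(np.asarray(true_means, dtype=float))
    fairness_regret = 0.0

    for t in range(1, horizon + 1):
        # Optimistic mean estimates, clipped to [0, 1] for Bernoulli rewards;
        # unvisited arms get the optimistic value 1.0.
        est = np.where(counts > 0, sums / np.maximum(counts, 1), 1.0)
        bonus = np.sqrt(2 * np.log(t) / np.maximum(counts, 1))
        ucb = np.clip(np.where(counts > 0, est + bonus, 1.0), 0.0, 1.0)

        # Randomized policy: exposure proportional to the merit of the optimistic estimates.
        pi_t = fair_policy(ucb)
        arm = rng.choice(k, p=pi_t)

        reward = float(rng.random() < true_means[arm])  # Bernoulli reward
        counts[arm] += 1
        sums[arm] += reward

        # Per-round fairness regret: L1 distance between played policy and the fair target.
        fairness_regret += np.abs(pi_t - pi_star).sum()

    return fairness_regret, counts / horizon, pi_star

if __name__ == "__main__":
    fr, exposure, pi_star = run_fair_bandit([0.8, 0.6, 0.4])
    print("cumulative fairness regret:", round(fr, 2))
    print("empirical exposure:", exposure.round(3))
    print("fair target policy:", pi_star.round(3))
```

Under this assumed formulation, even the lowest-merit arm retains a nonzero share of exposure, in contrast to the winner-takes-all allocation produced by a standard regret-minimizing policy.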

Related research

11/15/2022  On Penalization in Stochastic Multi-armed Bandits
We study an important variant of the stochastic multi-armed bandit (MAB)...

10/18/2022  Contextual bandits with concave rewards, and an application to fair ranking
We consider Contextual Bandits with Concave Rewards (CBCR), a multi-obje...

10/22/2020  Achieving User-Side Fairness in Contextual Bandits
Personalized recommendation based on multi-arm bandit (MAB) algorithms h...

09/21/2021  Achieving Counterfactual Fairness for Causal Bandit
In online recommendation, customers arrive in a sequential and stochasti...

06/23/2023  Trading-off price for data quality to achieve fair online allocation
We consider the problem of online allocation subject to a long-term fair...

11/14/2019  Unreliable Multi-Armed Bandits: A Novel Approach to Recommendation Systems
We use a novel modification of Multi-Armed Bandits to create a new model...

09/14/2020  Carousel Personalization in Music Streaming Apps with Contextual Bandits
Media services providers, such as music streaming platforms, frequently ...
