Self-fulfilling Bandits: Endogeneity Spillover and Dynamic Selection in Algorithmic Decision-making

08/28/2021
by Jin Li, et al.

In this paper, we study endogeneity problems in algorithmic decision-making where data and actions are interdependent. When there are endogenous covariates in a contextual multi-armed bandit model, a novel bias (the self-fulfilling bias) arises because the endogeneity of the covariates spills over to the actions. We propose a class of algorithms to correct for the bias by incorporating instrumental variables into leading online learning algorithms. These algorithms also attain regret levels that match the best-known lower bound for the case without endogeneity. To establish the theoretical properties, we develop a general technique that untangles the interdependence between data and actions.
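The abstract only sketches the approach in words, so the snippet below is a minimal illustrative sketch, not the authors' algorithm: an epsilon-greedy contextual bandit in which each arm's reward parameters are estimated with a just-identified instrumental-variables estimator, theta_a = (Z'X)^{-1} Z'y, instead of ordinary least squares, so that endogenous covariates do not contaminate the per-arm estimates. All names (IVBandit, choose, update), the epsilon-greedy exploration rule, and the ridge stabilizer are assumptions made for the example.

```python
# Hypothetical sketch: an epsilon-greedy contextual bandit whose per-arm
# parameter estimates use a just-identified IV estimator rather than OLS.
# This is an illustration of the general idea, not the paper's algorithm.
import numpy as np

class IVBandit:
    def __init__(self, n_arms, dim, epsilon=0.1, ridge=1e-3, seed=0):
        self.n_arms, self.epsilon = n_arms, epsilon
        self.rng = np.random.default_rng(seed)
        # Per-arm IV moments: theta_a = (sum z_t x_t')^{-1} (sum z_t y_t).
        # A small ridge term keeps the moment matrix invertible early on.
        self.ZX = [ridge * np.eye(dim) for _ in range(n_arms)]
        self.Zy = [np.zeros(dim) for _ in range(n_arms)]

    def choose(self, x):
        """Pick an arm for covariate vector x (epsilon-greedy exploration)."""
        if self.rng.random() < self.epsilon:
            return int(self.rng.integers(self.n_arms))
        estimates = [x @ np.linalg.solve(self.ZX[a], self.Zy[a])
                     for a in range(self.n_arms)]
        return int(np.argmax(estimates))

    def update(self, arm, x, z, reward):
        """Accumulate IV moments for the chosen arm; z is the instrument."""
        self.ZX[arm] += np.outer(z, x)
        self.Zy[arm] += z * reward


if __name__ == "__main__":
    # Toy simulation of endogeneity: a common shock u enters both the
    # covariates and the reward noise, while the instrument z is exogenous.
    rng = np.random.default_rng(1)
    theta_true = np.array([[1.0, 0.0], [0.0, 1.0]])  # true arm parameters
    bandit = IVBandit(n_arms=2, dim=2)
    for t in range(5000):
        z = rng.normal(size=2)          # exogenous instrument
        u = rng.normal()                # shock driving the endogeneity
        x = z + 0.8 * u                 # covariates correlated with u
        arm = bandit.choose(x)
        reward = x @ theta_true[arm] + u + 0.1 * rng.normal()
        bandit.update(arm, x, z, reward)
```

In this toy setup a plain least-squares update would be biased, because the shock u appears in both the covariates and the reward noise; the instrument z is correlated with x but independent of u, which is what the moment matrices in the sketch exploit.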
