Marketers employ various online advertising channels to reach customers,...
Bayesian bandit algorithms with approximate inference have been widely u...
In causal bandit problems, the action set consists of interventions on v...
We introduce causal Markov Decision Processes (C-MDPs), a new formalism ...
In a low-rank linear bandit problem, the reward of an action (represente...
We study how to learn optimal interventions sequentially given causal in...