Introduction to Multi-Armed Bandits

04/15/2019
by Aleksandrs Slivkins, et al.

Multi-armed bandits are a simple but very powerful framework for algorithms that make decisions over time under uncertainty. An enormous body of work has accumulated over the years, covered in several books and surveys. This book provides a more introductory, textbook-like treatment of the subject. Each chapter tackles a particular line of work, providing a self-contained, teachable technical introduction and a review of the more advanced results. The chapters are as follows: Stochastic Bandits; Lower Bounds; Bayesian Bandits and Thompson Sampling; Lipschitz Bandits; Full Feedback and Adversarial Costs; Adversarial Bandits; Linear Costs and Semi-bandits; Contextual Bandits; Bandits and Zero-Sum Games; Bandits with Knapsacks; and Incentivized Exploration and Connections to Mechanism Design. Status of the manuscript: essentially complete (modulo some polishing), except for the last chapter, which the author plans to add over the next few months.
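To make the framework concrete, here is a minimal sketch (not from the book) of the basic stochastic bandit protocol, using the classic UCB1 algorithm of Auer, Cesa-Bianchi, and Fischer as the decision rule; the Bernoulli arm means and the horizon below are illustrative assumptions, not values from the text.

```python
# A minimal sketch of the stochastic bandit loop with UCB1.
# Assumption: arms pay Bernoulli rewards with the (hypothetical) means given below.
import math
import random

def ucb1(arm_means, horizon):
    """Run UCB1 for `horizon` rounds; return the total realized reward."""
    k = len(arm_means)
    counts = [0] * k   # number of pulls per arm
    sums = [0.0] * k   # cumulative reward per arm
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1  # pull each arm once to initialize the estimates
        else:
            # Choose the arm maximizing empirical mean + confidence radius.
            arm = max(range(k), key=lambda a: sums[a] / counts[a]
                      + math.sqrt(2 * math.log(t) / counts[a]))
        reward = 1.0 if random.random() < arm_means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
        total += reward
    return total

# Example: three arms with unknown means; regret is measured against
# always pulling the best arm (mean 0.7).
print(ucb1([0.3, 0.5, 0.7], horizon=10_000))
```

The confidence radius shrinks as an arm is pulled more often, so the algorithm balances exploring uncertain arms against exploiting the empirically best one.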
