Regularized Contextual Bandits

10/11/2018
by Xavier Fontaine et al.

We consider the stochastic contextual bandit problem with additional regularization. The motivation comes from problems where the policy of the agent must stay close to some baseline policy known to perform well on the task. To tackle this problem we use a nonparametric model and propose an algorithm that splits the context space into bins and solves, simultaneously and independently, a regularized multi-armed bandit instance on each bin. We derive slow and fast rates of convergence, depending on the unknown complexity of the problem. We also consider a new, relevant margin condition to obtain problem-independent convergence rates, yielding intermediate rates that interpolate between the aforementioned slow and fast rates.
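The abstract outlines the algorithmic idea: discretize the context space into bins and run an independent regularized bandit in each bin, keeping the played policy close to a baseline. Below is a minimal, hypothetical Python sketch of that idea, not the paper's actual algorithm: the number of bins B, the uniform baseline policy, the regularization weight lam, and the convex-combination regularizer are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: contexts in [0, 1) are split into B equal-width bins,
# and each bin runs its own regularized K-armed bandit that mixes the empirically
# best arm with a baseline policy. All constants and the regularizer are assumptions.


class RegularizedBinBandit:
    """One regularized K-armed bandit instance attached to a single context bin."""

    def __init__(self, K, baseline, lam):
        self.K = K                 # number of arms
        self.baseline = baseline   # baseline policy: probability vector over arms
        self.lam = lam             # regularization strength toward the baseline
        self.counts = np.zeros(K)  # pulls per arm in this bin
        self.sums = np.zeros(K)    # cumulative reward per arm in this bin

    def policy(self):
        """Return a distribution over arms: greedy estimate mixed with the baseline."""
        pulls = self.counts.sum()
        if pulls < self.K:                       # pull each arm once first
            p = np.zeros(self.K)
            p[int(pulls)] = 1.0
            return p
        means = self.sums / np.maximum(self.counts, 1)
        greedy = np.zeros(self.K)
        greedy[np.argmax(means)] = 1.0
        w = self.lam / (1.0 + self.lam)          # lam controls closeness to the baseline
        return w * self.baseline + (1.0 - w) * greedy

    def update(self, arm, reward):
        self.counts[arm] += 1
        self.sums[arm] += reward


def run(T=10_000, B=10, K=3, lam=1.0, seed=0):
    rng = np.random.default_rng(seed)
    baseline = np.full(K, 1.0 / K)               # uniform baseline policy (assumption)
    bins = [RegularizedBinBandit(K, baseline, lam) for _ in range(B)]
    total = 0.0
    for _ in range(T):
        x = rng.random()                         # context drawn uniformly on [0, 1)
        b = min(int(x * B), B - 1)               # index of the context bin
        arm = rng.choice(K, p=bins[b].policy())
        # Toy reward whose best arm depends on the context, just to exercise the bins.
        reward = float(rng.random() < (0.3 + 0.4 * (arm == int(x * K))))
        bins[b].update(arm, reward)
        total += reward
    return total / T


if __name__ == "__main__":
    print(f"average reward: {run():.3f}")
```

Because each bin keeps its own statistics and policy, the instances are solved independently and in parallel over time, which is the structural point the abstract makes; the per-bin regularizer here is only one simple way to stay close to the baseline.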


Related research

05/24/2019
OSOM: A Simultaneously Optimal Algorithm for Multi-Armed and Linear Contextual Bandits
We consider the stochastic linear (multi-armed) contextual bandit proble...

01/05/2018
Nonparametric Stochastic Contextual Bandits
We analyze the K-armed bandit problem where the reward for each arm is a...

02/17/2015
Regret bounds for Narendra-Shapiro bandit algorithms
Narendra-Shapiro (NS) algorithms are bandit-type algorithms that have be...

01/28/2020
Faster Activity and Data Detection in Massive Random Access: A Multi-armed Bandit Approach
This paper investigates the grant-free random access with massive IoT de...

01/18/2011
Convergence rates of efficient global optimization algorithms
Efficient global optimization is the problem of minimizing an unknown fu...

02/01/2021
Fast rates in structured prediction
Discrete supervised learning problems such as classification are often t...

08/06/2021
Joint AP Probing and Scheduling: A Contextual Bandit Approach
We consider a set of APs with unknown data rates that cooperatively serv...
