Contextual Bandits with Sparse Data in Web Settings

05/06/2021
by Björn H. Eriksson, et al.

This paper is a scoping study identifying current methods for handling sparse data with contextual bandits in web settings. The area is highly active, and state-of-the-art methods are identified. The years 2017-2020 are covered, yielding 19 method articles and two review articles. Five categories of methods are described, making it easy to choose how to address sparse data using contextual bandits, with a method available for modification in the specific setting of concern. In addition, each method offers multiple techniques to choose from for future evaluation. The problem areas covered by each article are also noted. An overall, up-to-date understanding of sparse-data problems using contextual bandits in web settings is given. The identified method categories are policy evaluation (offline and online), hybrid methods, model representation (clusters and deep neural networks), dimensionality reduction, and simulation.
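To illustrate the dimensionality-reduction category named above, here is a minimal sketch, not taken from the paper, that combines a Gaussian random projection of a sparse high-dimensional web context with a standard LinUCB-style linear bandit. All dimensions, parameter names (`D`, `d`, `alpha`), and the toy reward model are assumptions for the example only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: sparse 1000-dim web contexts, 5 arms, projected to 20 dims.
D, d, n_arms = 1000, 20, 5

# Gaussian random projection matrix to compress the sparse context.
P = rng.normal(0.0, 1.0 / np.sqrt(d), size=(d, D))

# LinUCB statistics per arm, kept in the reduced space.
A = [np.eye(d) for _ in range(n_arms)]    # ridge Gram matrices
b = [np.zeros(d) for _ in range(n_arms)]  # reward-weighted feature sums
alpha = 1.0                               # exploration strength

def choose_arm(x_sparse):
    """Project the sparse context, then pick the arm with the highest UCB."""
    z = P @ x_sparse
    scores = []
    for a in range(n_arms):
        A_inv = np.linalg.inv(A[a])
        theta = A_inv @ b[a]
        ucb = theta @ z + alpha * np.sqrt(z @ A_inv @ z)
        scores.append(ucb)
    return int(np.argmax(scores)), z

def update(arm, z, reward):
    """Standard LinUCB ridge-regression update for the chosen arm."""
    A[arm] += np.outer(z, z)
    b[arm] += reward * z

# Simulated interaction loop on synthetic sparse contexts (toy reward model).
for t in range(200):
    x = np.zeros(D)
    x[rng.choice(D, size=10, replace=False)] = 1.0  # 10 non-zero features
    arm, z = choose_arm(x)
    reward = float(rng.random() < 0.1 * (arm + 1) / n_arms)
    update(arm, z, reward)
```

The projection keeps the per-step matrix inversion at O(d^3) instead of O(D^3), which is the usual motivation for applying dimensionality reduction before a linear bandit on sparse web features.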


