Combinatorial Bandits for Incentivizing Agents with Dynamic Preferences

07/06/2018
by   Tanner Fiez, et al.

The design of personalized incentives or recommendations to improve user engagement is gaining prominence as digital platforms proliferate. We propose a multi-armed bandit framework for matching incentives to users whose preferences are unknown a priori and evolve dynamically over time, in a resource-constrained environment. We design an algorithm that combines ideas from three distinct domains: (i) a greedy matching paradigm, (ii) the upper confidence bound (UCB) algorithm for bandits, and (iii) mixing times from the theory of Markov chains. For this algorithm, we provide theoretical bounds on the regret and demonstrate its performance on both synthetic and realistic examples (matching supply and demand in a bike-sharing platform).
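The paper's algorithm layers greedy matching and Markov-chain mixing-time arguments on top of a UCB index; the full method is in the paper. As background only, here is a minimal sketch of the classic UCB1 index the approach builds on (the function and variable names are illustrative, not from the paper):

```python
import math
import random

def ucb1(pull, n_arms, horizon):
    """Minimal UCB1 sketch: pull each arm once, then repeatedly play the
    arm maximizing empirical mean + sqrt(2 ln t / n_pulls)."""
    counts = [0] * n_arms      # number of times each arm was played
    sums = [0.0] * n_arms      # cumulative reward per arm
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1        # initialization: try every arm once
        else:
            arm = max(
                range(n_arms),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2.0 * math.log(t) / counts[a]),
            )
        counts[arm] += 1
        sums[arm] += pull(arm)
    return counts

# Usage: two Bernoulli arms with (unknown to the learner) means 0.3 and 0.7;
# over time UCB1 concentrates its pulls on the better arm.
random.seed(0)
means = [0.3, 0.7]
counts = ucb1(lambda a: 1.0 if random.random() < means[a] else 0.0,
              n_arms=2, horizon=2000)
```

The paper's setting differs in two key ways this sketch ignores: user preferences evolve over time (hence the mixing-time analysis), and incentives are allocated to many users at once under a resource constraint (hence the greedy matching step).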


Related research

03/11/2018 · Incentives in the Dark: Multi-armed Bandits for Evolving Users with Unknown Type
Design of incentives or recommendations to users is becoming more common...

09/26/2013 · Building Bridges: Viewing Active Learning from the Multi-Armed Bandit Lens
In this paper we propose a multi-armed bandit inspired, pool based activ...

07/21/2023 · Bandits with Deterministically Evolving States
We propose a model for learning with bandit feedback while accounting fo...

02/25/2021 · Federated Multi-armed Bandits with Personalization
A general framework of personalized federated multi-armed bandits (PF-MA...

01/24/2023 · Double Matching Under Complementary Preferences
In this paper, we propose a new algorithm for addressing the problem of ...

04/26/2022 · Thompson Sampling for Bandit Learning in Matching Markets
The problem of two-sided matching markets has a wide range of real-world...

01/22/2020 · Incentivising Exploration and Recommendations for Contextual Bandits with Payments
We propose a contextual bandit based model to capture the learning and s...
