New Algorithms for Multiplayer Bandits when Arm Means Vary Among Players

02/04/2019
by   Emilie Kaufmann, et al.

We study multiplayer stochastic multi-armed bandit problems in which the players cannot communicate, and if two or more players pull the same arm, a collision occurs and the involved players receive zero reward. Moreover, we assume each arm has a different mean for each player. Let T denote the number of rounds. An algorithm with regret O((log T)^(2+κ)) for any constant κ was recently presented by Bistritz and Leshem (NeurIPS 2018), who left the existence of an algorithm with O(log T) regret as an open question. In this paper, we provide an affirmative answer to this question in the case when there is a unique optimal assignment of players to arms. For the general case, we present an algorithm with expected regret O((log T)^(1+κ)), for any κ > 0.
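To make the setting concrete, here is a minimal sketch of the reward model the abstract describes: each player has their own mean for each arm, a player who pulls an arm alone receives a Bernoulli reward drawn from their own mean, and all players involved in a collision receive zero. The function name and structure are illustrative, not from the paper.

```python
import random

def play_round(pulls, means):
    """One round of the multiplayer bandit collision model.

    pulls: pulls[j] is the arm chosen by player j this round.
    means: means[j][k] is player j's mean reward for arm k
           (arm means vary among players, as in the paper's setting).

    A player alone on an arm gets a Bernoulli draw from their own
    mean; if two or more players pull the same arm, all of them
    receive zero reward (a collision).
    """
    rewards = []
    for j, arm in enumerate(pulls):
        if pulls.count(arm) > 1:
            # collision: every player on this arm gets zero
            rewards.append(0.0)
        else:
            # unique puller: Bernoulli reward from player j's own mean
            rewards.append(1.0 if random.random() < means[j][arm] else 0.0)
    return rewards
```

Regret over T rounds then compares the cumulative reward collected to that of the best collision-free assignment of players to arms, which is what makes the "unique optimal assignment" case in the abstract a meaningful distinction.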


