On No-Sensing Adversarial Multi-player Multi-armed Bandits with Collision Communications

11/02/2020
by Chengshuai Shi, et al.

We study the notoriously difficult no-sensing adversarial multi-player multi-armed bandits (MP-MAB) problem from a new perspective. Instead of focusing on the hardness introduced by multiple players, we introduce a new dimension of hardness, called attackability. All adversaries can be categorized based on their attackability, and we introduce Adversary-Adaptive Collision-Communication (A2C2), a family of algorithms with forced-collision communication among players. Both attackability-aware and attackability-unaware settings are studied, and information-theoretic tools, namely the Z-channel model and error-correction coding, are used to address the challenge of implicit communication without collision information in an adversarial environment. For the more challenging attackability-unaware problem, we propose a simple method to estimate the attackability, enabled by a novel error-detection repetition code and randomized communication for synchronization. Theoretical analysis proves that an asymptotic attackability-dependent sublinear regret can be achieved, with or without knowledge of the attackability. In particular, the asymptotic regret does not depend exponentially on the number of players, revealing a fundamental tradeoff between the two dimensions of hardness in this problem.
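
As a rough intuition for the forced-collision signaling and the Z-channel view mentioned in the abstract, the toy sketch below simulates a sender conveying bits to a receiver purely through collisions, with a repetition code guarding against the adversary zeroing the receiver's rewards. This is an illustrative, assumption-laden simplification, not the A2C2 protocol itself; the function name zchannel_send, the Bernoulli "zeroing" adversary, and the parameters adversary_zero_prob and reps are all hypothetical.

```python
import random


def zchannel_send(bits, adversary_zero_prob, reps, rng=random):
    """Toy simulation of implicit communication via forced collisions.

    Model (an assumption for illustration, not the paper's exact protocol):
    the sender signals '1' by colliding on the receiver's arm, so the
    receiver observes reward 0; it signals '0' by staying away, so the
    receiver observes the arm's adversarial reward, which may itself be 0.
    Only 0 -> 1 flips are possible, which is exactly a Z-channel. Each bit
    is repeated `reps` times as a simple repetition code.
    """
    decoded = []
    for b in bits:
        observed = []
        for _ in range(reps):
            if b == 1:
                observed.append(0.0)  # collision forces a zero reward
            else:
                # no collision, but the adversary may still assign reward 0
                reward = 0.0 if rng.random() < adversary_zero_prob else 1.0
                observed.append(reward)
        # Decode '0' if any repetition slot showed a positive reward;
        # under this model a transmitted '1' is never mistaken for a '0'.
        decoded.append(0 if any(r > 0 for r in observed) else 1)
    return decoded


if __name__ == "__main__":
    random.seed(1)
    message = [0, 1, 1, 0, 0, 1]
    # With enough repetitions, 0 -> 1 flips become unlikely even against
    # an adversary that zeroes the arm's reward most of the time.
    print(zchannel_send(message, adversary_zero_prob=0.7, reps=8))
```

The repetition length needed in this toy model grows with how aggressively the adversary can zero rewards, which is one way to picture why the regret bounds described above depend on the attackability of the adversary.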

research, 06/25/2021
Multi-player Multi-armed Bandits with Collision-Dependent Reward Distributions
We study a new stochastic multi-player multi-armed bandits (MP-MAB) prob...

research, 02/29/2020
Decentralized Multi-player Multi-armed Bandits with No Collision Information
The decentralized stochastic multi-player multi-armed bandit (MP-MAB) pr...

research, 04/28/2019
Non-Stochastic Multi-Player Multi-Armed Bandits: Optimal Rate With Collision Information, Sublinear Without
We consider the non-stochastic version of the (cooperative) multi-player...

research, 11/08/2021
An Instance-Dependent Analysis for the Cooperative Multi-Player Multi-Armed Bandit
We study the problem of information sharing and cooperation in Multi-Pla...

research, 09/17/2018
Multi-Player Bandits: A Trekking Approach
We study stochastic multi-armed bandits with many players. The players d...

research, 08/25/2018
Multiplayer bandits without observing collision information
We study multiplayer stochastic multi-armed bandit problems in which the...

research, 02/10/2021
Player Modeling via Multi-Armed Bandits
This paper focuses on building personalized player models solely from pl...
