Dynamic Spectrum Access using Stochastic Multi-User Bandits

01/12/2021
by Meghana Bande, et al.

A stochastic multi-user multi-armed bandit framework is used to develop algorithms for uncoordinated spectrum access. In contrast to prior work, rewards are assumed to be possibly non-zero even under collisions, which allows the number of users to exceed the number of channels. The proposed algorithm consists of an estimation phase and an allocation phase. It is shown that if every user adopts the algorithm, the system-wide regret is order-optimal, of order O(log T) over a time horizon of duration T. The regret guarantees hold both when the number of users is greater than the number of channels and when it is smaller. The algorithm is extended to the dynamic case, where the number of users in the system evolves over time, and is shown to achieve sub-linear regret.
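The two-phase structure described above can be sketched in simulation. Note this is only an illustrative toy, not the paper's algorithm: the channel means, the collision-reward model (each colliding user receives a reduced but non-zero mean reward), the staggered exploration schedule, and the rank-based allocation rule are all assumptions chosen for clarity.

```python
import random

def run_multiuser_bandit(n_users=3, n_channels=2, t_explore=2000,
                         t_total=10000, seed=0):
    """Toy two-phase multi-user bandit: an estimation phase in which each
    user builds per-channel mean-reward estimates, then an allocation phase
    in which users settle on channels by rank. All statistics below are
    hypothetical, for illustration only."""
    rng = random.Random(seed)
    solo_means = [0.9, 0.6]   # hypothetical mean reward of a channel used alone
    collision_factor = 0.4    # hypothetical reward scaling when users collide

    estimates = [[0.0] * n_channels for _ in range(n_users)]
    counts = [[0] * n_channels for _ in range(n_users)]
    total_reward = 0.0

    def draw_rewards(choices):
        # Bernoulli reward per user; collisions shrink the mean but keep it
        # non-zero, so more users than channels remains viable.
        occupancy = [choices.count(c) for c in range(n_channels)]
        rewards = []
        for c in choices:
            mean = solo_means[c] * (collision_factor if occupancy[c] > 1 else 1.0)
            rewards.append(1.0 if rng.random() < mean else 0.0)
        return rewards

    for t in range(t_total):
        if t < t_explore:
            # Estimation phase: staggered round-robin sampling so users
            # tend to probe different channels at the same time.
            choices = [(t + u) % n_channels for u in range(n_users)]
        else:
            # Allocation phase: user u occupies its rank-(u mod K) best
            # channel; surplus users wrap around and share channels,
            # exploiting the non-zero rewards under collisions.
            choices = []
            for u in range(n_users):
                order = sorted(range(n_channels), key=lambda c: -estimates[u][c])
                choices.append(order[u % n_channels])
        rewards = draw_rewards(choices)
        for u, (c, r) in enumerate(zip(choices, rewards)):
            counts[u][c] += 1
            estimates[u][c] += (r - estimates[u][c]) / counts[u][c]
        total_reward += sum(rewards)

    return total_reward / t_total, estimates
```

With 3 users and 2 channels, two users end up sharing the better channel while the third occupies the other, so the system collects a non-trivial sum reward every round despite the collision.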


Related research

- Multi-User Multi-Armed Bandits for Uncoordinated Spectrum Access (07/02/2018): A stochastic multi-user multi-armed bandit framework is used to develop ...
- Multi-player Multi-Armed Bandits with non-zero rewards on collisions for uncoordinated spectrum access (10/21/2019): In this paper, we study the uncoordinated spectrum access problem using ...
- Multi-User MABs with User Dependent Rewards for Uncoordinated Spectrum Access (10/21/2019): Multi-user multi-armed bandits have emerged as a good model for uncoordi...
- Fairness and Welfare Quantification for Regret in Multi-Armed Bandits (05/27/2022): We extend the notion of regret with a welfarist perspective. Focussing o...
- SpecWatch: A Framework for Adversarial Spectrum Monitoring with Unknown Statistics (10/16/2017): In cognitive radio networks (CRNs), dynamic spectrum access has been pro...
- Multi-user Communication Networks: A Coordinated Multi-armed Bandit Approach (08/14/2018): Communication networks shared by many users are a widespread challenge n...
- Decision Maker using Coupled Incompressible-Fluid Cylinders (02/13/2015): The multi-armed bandit problem (MBP) is the problem of finding, as accur...
