Distributed Clustering of Linear Bandits in Peer to Peer Networks

04/26/2016
by Nathan Korda, et al.

We provide two distributed confidence-ball algorithms for solving linear bandit problems in peer-to-peer networks with limited communication capabilities. For the first, we assume that all the peers are solving the same linear bandit problem, and prove that our algorithm achieves the optimal asymptotic regret rate of any centralised algorithm that can instantly communicate information between the peers. For the second, we assume that there are clusters of peers solving the same bandit problem within each cluster, and we prove that our algorithm discovers these clusters while achieving the optimal asymptotic regret rate within each one. Through experiments on several real-world datasets, we demonstrate the performance of the proposed algorithms compared to the state-of-the-art.
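The paper does not reproduce its algorithms here, but the single-peer building block it extends is the standard confidence-ball (LinUCB/OFUL-style) linear bandit update: maintain a regularised Gram matrix and a reward-weighted feature sum, and play the arm that is optimistic within the resulting confidence ellipsoid. A minimal sketch of that generic building block (not the paper's distributed algorithm; the function names and the `beta` width parameter are illustrative assumptions):

```python
import numpy as np

def confidence_ball_step(A, b, arms, beta):
    """One round of a generic confidence-ball linear bandit.

    A:    (d, d) regularised Gram matrix, lambda*I + sum of x x^T
    b:    (d,)   reward-weighted feature sum, sum of r * x
    arms: (k, d) candidate arm feature vectors
    beta: confidence-ball width parameter
    Returns the index of the optimistic arm.
    """
    A_inv = np.linalg.inv(A)
    theta_hat = A_inv @ b  # ridge estimate of the unknown parameter
    # Optimism: estimated reward plus beta times the A^{-1}-norm of each arm.
    widths = np.sqrt(np.einsum('ij,jk,ik->i', arms, A_inv, arms))
    return int(np.argmax(arms @ theta_hat + beta * widths))

def update(A, b, x, r):
    """Rank-one statistics update after observing reward r for played arm x."""
    return A + np.outer(x, x), b + r * x
```

In the paper's distributed setting, peers share (and, in the clustered variant, selectively pool) such sufficient statistics over the network rather than raw observations, which is what keeps the communication cost low.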


Related research

- Online Clustering of Bandits (01/31/2014): We introduce a novel algorithmic approach to content recommendation base...
- Fast Distributed Bandits for Online Recommendation Systems (07/16/2020): Contextual bandit algorithms are commonly used in recommender systems, w...
- Cooperative Thresholded Lasso for Sparse Linear Bandit (05/30/2023): We present a novel approach to address the multi-agent sparse contextual...
- Distributed Contextual Linear Bandits with Minimax Optimal Communication Cost (05/26/2022): We study distributed contextual linear bandits with stochastic contexts,...
- Cooperative Multi-agent Bandits: Distributed Algorithms with Optimal Individual Regret and Constant Communication Costs (08/08/2023): Recently, there has been extensive study of cooperative multi-agent mult...
- On Learning to Rank Long Sequences with Contextual Bandits (06/07/2021): Motivated by problems of learning to rank long item sequences, we introd...
- On Context-Dependent Clustering of Bandits (08/06/2016): We investigate a novel cluster-of-bandit algorithm CAB for collaborative...
