Graph Clustering Bandits for Recommendation

05/02/2016
by Claudio Gentile, et al.

We investigate an efficient context-dependent clustering technique for recommender systems based on exploration-exploitation strategies through multi-armed bandits over multiple users. Our algorithm dynamically groups users based on the behavioral similarity they exhibit over a sequence of logged activities. In doing so, the algorithm reacts to the currently served user by shaping clusters around that user while, at the same time, exploring the generation of clusters over users who are not currently engaged. We motivate the effectiveness of this clustering policy and provide an extensive empirical analysis on real-world datasets, showing scalability and improved prediction performance over state-of-the-art methods for sequential clustering of users in multi-armed bandit scenarios.
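
To make the abstract's description concrete, below is a minimal Python sketch of a context-dependent clustering bandit in the spirit of the CAB line of work referenced on this page. It is an assumption-laden illustration, not the authors' implementation: the class name ClusteringBandit, the gamma clustering threshold, the LinUCB-style confidence width, and the rule for sharing updates with neighbors are all illustrative choices.

import numpy as np

class ClusteringBandit:
    """Sketch: per-user linear bandits with per-round, per-context user neighborhoods."""

    def __init__(self, n_users, dim, alpha=1.0, gamma=0.2):
        self.alpha = alpha  # exploration scale (assumed constant, not from the paper)
        self.gamma = gamma  # neighborhood threshold (assumed constant)
        # Per-user ridge-regression statistics: A_i = I + sum x x^T, b_i = sum r x
        self.A = np.stack([np.eye(dim) for _ in range(n_users)])
        self.b = np.zeros((n_users, dim))

    def _est(self, i, x):
        # Payoff estimate w_i^T x and a LinUCB-style confidence width for user i on context x.
        w = np.linalg.solve(self.A[i], self.b[i])
        cb = self.alpha * np.sqrt(x @ np.linalg.solve(self.A[i], x))
        return w @ x, cb

    def recommend(self, user, contexts):
        """Pick an arm for `user` from the rows of `contexts` (one context vector per arm)."""
        best_arm, best_ucb, best_nbhd = None, -np.inf, None
        for a, x in enumerate(contexts):
            mu_u, cb_u = self._est(user, x)
            # Context-dependent neighborhood: users whose estimated payoff on x is
            # statistically indistinguishable from the served user's. This is how the
            # cluster is "shaped around" the current user, one context at a time.
            nbhd = [j for j in range(len(self.A))
                    if abs(self._est(j, x)[0] - mu_u)
                    <= self.gamma * (cb_u + self._est(j, x)[1])]
            # Aggregate the neighborhood's statistics and score the arm by UCB.
            d = len(x)
            A_n = np.eye(d) + sum(self.A[j] - np.eye(d) for j in nbhd)
            b_n = sum(self.b[j] for j in nbhd)
            w_n = np.linalg.solve(A_n, b_n)
            ucb = w_n @ x + self.alpha * np.sqrt(x @ np.linalg.solve(A_n, x))
            if ucb > best_ucb:
                best_arm, best_ucb, best_nbhd = a, ucb, nbhd
        return best_arm, best_nbhd

    def update(self, user, x, reward, nbhd=()):
        # Update the served user; propagating the update to the round's neighbors is one
        # (simplified) way to let clusters also evolve over users not currently engaged.
        for j in {user, *nbhd}:
            self.A[j] += np.outer(x, x)
            self.b[j] += reward * x

A hypothetical interaction round, with random vectors standing in for item features:

rng = np.random.default_rng(0)
bandit = ClusteringBandit(n_users=50, dim=10)
contexts = rng.standard_normal((20, 10))  # 20 candidate items
arm, nbhd = bandit.recommend(user=3, contexts=contexts)
bandit.update(3, contexts[arm], reward=1.0, nbhd=nbhd)

Note the design point the abstract emphasizes: the neighborhood is recomputed per context, so two users may be clustered together for one item and apart for another, rather than being assigned to a single global cluster.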


Related research

10/12/2015  Context-Aware Bandits
We propose an efficient Context-Aware clustering of Bandits (CAB) algori...

02/11/2015  Collaborative Filtering Bandits
Classical collaborative filtering, and content-based filtering methods t...

07/31/2018  Graph-Based Recommendation System
In this work, we study recommendation systems modelled as contextual mul...

06/04/2013  A Gang of Bandits
Multi-armed bandit problems are receiving a great deal of attention beca...

06/11/2020  Bandit-PAM: Almost Linear Time k-Medoids Clustering via Multi-Armed Bandits
Clustering is a ubiquitous task in data science. Compared to the commonl...

04/16/2023  A Field Test of Bandit Algorithms for Recommendations: Understanding the Validity of Assumptions on Human Preferences in Multi-armed Bandits
Personalized recommender systems suffuse modern life, shaping what media...

08/06/2016  On Context-Dependent Clustering of Bandits
We investigate a novel cluster-of-bandit algorithm CAB for collaborative...
