Bandits Under The Influence (Extended Version)

09/21/2020
by Silviu Maniu, et al.

Recommender systems should adapt to user interests as the latter evolve. A prevalent cause for the evolution of user interests is the influence of their social circle. In general, when the interests are not known, online algorithms that explore the recommendation space while also exploiting observed preferences are preferable. We present online recommendation algorithms rooted in the linear multi-armed bandit literature. Our bandit algorithms are tailored precisely to recommendation scenarios where user interests evolve under social influence. In particular, we show that our adaptations of the classic LinREL and Thompson Sampling algorithms maintain the same asymptotic regret bounds as in the non-social case. We validate our approach experimentally using both synthetic and real datasets.
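
To make the setting concrete, the following is a minimal, self-contained sketch (not the authors' implementation) of plain linear Thompson Sampling run in an environment where user interest vectors drift under social influence. The influence model (row-stochastic mixing of neighbours' interests each round) and all names and parameters below are illustrative assumptions; the paper's adapted algorithms modify the learner to account for this drift, which the vanilla learner here does not.

```python
# Minimal sketch: linear Thompson Sampling facing interests that evolve
# under social influence. All modelling choices here are assumptions for
# illustration, not the paper's exact setup.
import numpy as np

rng = np.random.default_rng(0)

n_users, d, n_items, horizon = 5, 4, 20, 2000
sigma_noise, lam, nu = 0.1, 1.0, 0.1

# Item features (the "arms") and the hidden initial user interest vectors.
items = rng.normal(size=(n_items, d))
items /= np.linalg.norm(items, axis=1, keepdims=True)
theta = rng.normal(size=(n_users, d))

# Row-stochastic social-influence matrix: every round, each user's interests
# drift toward a weighted average of their neighbours' interests.
W = rng.random(size=(n_users, n_users))
W /= W.sum(axis=1, keepdims=True)

# Per-user statistics for Bayesian linear regression on observed rewards.
B = np.stack([lam * np.eye(d) for _ in range(n_users)])  # precision matrices
f = np.zeros((n_users, d))                                # reward-weighted sums

regret = 0.0
for t in range(horizon):
    u = t % n_users                         # round-robin over users
    mu = np.linalg.solve(B[u], f[u])        # posterior mean of user u's interests
    cov = nu ** 2 * np.linalg.inv(B[u])
    theta_sample = rng.multivariate_normal(mu, cov)

    # Thompson step: recommend the item that is best for the sampled parameter.
    a = int(np.argmax(items @ theta_sample))
    reward = items[a] @ theta[u] + sigma_noise * rng.normal()
    regret += np.max(items @ theta[u]) - items[a] @ theta[u]

    # Posterior update for the chosen user.
    B[u] += np.outer(items[a], items[a])
    f[u] += reward * items[a]

    # Social drift: interests evolve as a mixture of neighbours' interests.
    theta = W @ theta

print(f"cumulative regret after {horizon} rounds: {regret:.2f}")
```

Because the hidden interest vectors keep moving, the fixed-parameter posterior above gradually becomes stale; the algorithms studied in the paper are designed so that the regret guarantees of the classic linear bandit analyses still hold despite this drift.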


research · 09/11/2021
Existence conditions for hidden feedback loops in online recommender systems
We explore a hidden feedback loops effect in online recommender systems...

research · 07/01/2021
The Use of Bandit Algorithms in Intelligent Interactive Recommender Systems
In today's business marketplace, many high-tech Internet enterprises con...

research · 07/31/2018
Graph-Based Recommendation System
In this work, we study recommendation systems modelled as contextual mul...

research · 08/16/2019
Accelerated learning from recommender systems using multi-armed bandit
Recommendation systems are a vital component of many online marketplaces...

research · 03/23/2018
Learning Recommendations While Influencing Interests
Personalized recommendation systems (RS) are extensively used in many se...

research · 06/04/2013
A Gang of Bandits
Multi-armed bandit problems are receiving a great deal of attention beca...

research · 07/21/2023
Bandits with Deterministically Evolving States
We propose a model for learning with bandit feedback while accounting fo...
