The Use of Bandit Algorithms in Intelligent Interactive Recommender Systems

07/01/2021
by Qing Wang et al.

In today's business marketplace, many high-tech Internet enterprises constantly explore innovative ways to provide optimal online user experiences and gain a competitive advantage. This creates a great need for intelligent interactive recommender systems that sequentially suggest the most suitable items to users by accurately predicting their preferences, while continuously incorporating up-to-date feedback to refine their recommendations. Multi-armed bandit algorithms, which have been widely applied to various online systems, are well suited to delivering such efficient recommendation services. However, few existing bandit models are able to adapt to the new changes introduced by modern recommender systems.
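The abstract does not specify which bandit algorithm the authors build on, so the following is only a minimal sketch of the general idea: UCB1, one standard multi-armed bandit strategy, applied to interactive recommendation by treating each candidate item as an arm and a click as a binary reward. The class name, the simulated click rates, and the interaction loop are all hypothetical illustrations, not the paper's model.

```python
import math
import random

class UCB1Recommender:
    """UCB1 bandit: each candidate item is an arm; reward is binary click feedback."""

    def __init__(self, n_items):
        self.counts = [0] * n_items    # times each item was recommended
        self.values = [0.0] * n_items  # running mean reward per item
        self.total = 0                 # total recommendations made

    def select_item(self):
        # Recommend each item once before applying the UCB rule.
        for item, count in enumerate(self.counts):
            if count == 0:
                return item
        # Pick the item with the highest upper confidence bound:
        # mean reward plus an exploration bonus that shrinks with plays.
        return max(
            range(len(self.counts)),
            key=lambda i: self.values[i]
            + math.sqrt(2 * math.log(self.total) / self.counts[i]),
        )

    def update(self, item, reward):
        # Fold the latest user feedback into the item's running mean.
        self.total += 1
        self.counts[item] += 1
        self.values[item] += (reward - self.values[item]) / self.counts[item]


# Simulated interaction loop; the environment stands in for real user feedback.
true_ctr = [0.05, 0.12, 0.30, 0.08]  # hypothetical per-item click rates
bandit = UCB1Recommender(n_items=len(true_ctr))
for _ in range(1000):
    item = bandit.select_item()
    click = 1 if random.random() < true_ctr[item] else 0
    bandit.update(item, click)
print("estimated CTRs:", [round(v, 3) for v in bandit.values])
```

The exploration bonus shrinks as an item accumulates feedback, so the loop gradually shifts from exploring uncertain items to exploiting the apparently best one, which is the sequential refinement behavior the abstract describes.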


Related research

08/16/2019
Accelerated learning from recommender systems using multi-armed bandit
Recommendation systems are a vital component of many online marketplaces...

01/12/2022
Proceedings of the 4th Workshop on Online Recommender Systems and User Modeling – ORSUM 2021
Modern online services continuously generate data at very fast rates. Th...

09/21/2020
Bandits Under The Influence (Extended Version)
Recommender systems should adapt to user interests as the latter evolve....

09/11/2021
Existence conditions for hidden feedback loops in online recommender systems
We explore a hidden feedback loops effect in online recommender systems....

07/01/2019
Bandit Learning for Diversified Interactive Recommendation
Interactive recommender systems that enable the interactions between use...

04/16/2023
A Field Test of Bandit Algorithms for Recommendations: Understanding the Validity of Assumptions on Human Preferences in Multi-armed Bandits
Personalized recommender systems suffuse modern life, shaping what media...

09/13/2020
Spoiled for Choice? Personalized Recommendation for Healthcare Decisions: A Multi-Armed Bandit Approach
Online healthcare communities provide users with various healthcare inte...
