Online Learning for Active Cache Synchronization

02/27/2020
by Andrey Kolobov, et al.

Existing multi-armed bandit (MAB) models make two implicit assumptions: an arm generates a payoff only when it is played, and the agent observes every payoff that is generated. This paper introduces synchronization bandits, a MAB variant where all arms generate costs at all times, but the agent observes an arm's instantaneous cost only when that arm is played. Synchronization MABs are inspired by online caching scenarios such as Web crawling, where an arm corresponds to a cached item and playing the arm means downloading a fresh copy of it from its server. We present MirrorSync, an online learning algorithm for synchronization bandits, establish an adversarial regret bound of O(T^(2/3)) for it, and show how to make it efficient in practice.
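To make the feedback model concrete, below is a minimal sketch of the interaction loop described above: every arm (cached item) accrues cost at every time step, but the agent only observes an arm's instantaneous cost when it plays (syncs) that arm. The staleness-based cost model and the round-robin policy driving the loop are illustrative assumptions for this sketch only; they are not the paper's cost model and the policy is not MirrorSync.

```python
import random

class SynchronizationBandit:
    """Toy model of the synchronization-bandit feedback structure:
    all arms generate costs at all times, but the agent observes an
    arm's instantaneous cost only when it plays (syncs) that arm."""

    def __init__(self, num_arms, change_probs):
        # change_probs[i]: assumed per-step probability that item i's source
        # changes, making the cached copy stale (an illustrative cost model).
        self.num_arms = num_arms
        self.change_probs = change_probs
        self.stale = [False] * num_arms

    def step(self, played_arm):
        """Advance one time step; return only the played arm's observed cost."""
        # Hidden costs: every stale cached item incurs cost 1 this step,
        # whether or not it is played; the agent never sees these directly.
        hidden_costs = [1.0 if s else 0.0 for s in self.stale]

        # Playing an arm = downloading a fresh copy; the agent observes that
        # arm's instantaneous cost, and the item becomes fresh again.
        observed_cost = hidden_costs[played_arm]
        self.stale[played_arm] = False

        # Sources drift between syncs, so cached copies silently go stale.
        for i in range(self.num_arms):
            if random.random() < self.change_probs[i]:
                self.stale[i] = True
        return observed_cost


# Hypothetical round-robin baseline (not MirrorSync), just to drive the loop.
env = SynchronizationBandit(num_arms=3, change_probs=[0.1, 0.3, 0.6])
for t in range(10):
    arm = t % env.num_arms
    cost = env.step(arm)
    print(f"t={t}: played arm {arm}, observed cost {cost}")
```

The point of the sketch is the asymmetry in feedback: the regret of a policy depends on the hidden costs of all arms, yet each play reveals only one arm's instantaneous cost, which is what distinguishes this setting from standard MABs.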


Related research

07/13/2020 - Fair Algorithms for Multi-Agent Multi-Armed Bandits
We propose a multi-agent variant of the classical multi-armed bandit pro...

11/03/2020 - Multi-armed Bandits with Cost Subsidy
In this paper, we consider a novel variant of the multi-armed bandit (MA...

05/30/2022 - Optimistic Whittle Index Policy: Online Learning for Restless Bandits
Restless multi-armed bandits (RMABs) extend multi-armed bandits to allow...

05/31/2022 - Online Meta-Learning in Adversarial Multi-Armed Bandits
We study meta-learning for adversarial multi-armed bandits. We consider ...

11/05/2019 - Response Prediction for Low-Regret Agents
Companies like Google and Microsoft run billions of auctions every day t...

09/19/2021 - Generalized Translation and Scale Invariant Online Algorithm for Adversarial Multi-Armed Bandits
We study the adversarial multi-armed bandit problem and create a complet...

08/01/2020 - Data-Driven Bandit Learning for Proactive Cache Placement in Fog-Assisted IoT Systems
In Fog-assisted IoT systems, it is a common practice to cache popular co...
