The True Sample Complexity of Identifying Good Arms

06/15/2019
by Julian Katz-Samuels, et al.

We consider two multi-armed bandit problems with n arms: (i) given an ϵ > 0, identify an arm whose mean is within ϵ of the largest mean, and (ii) given a threshold μ_0 and an integer k, identify k arms with means larger than μ_0. Existing lower bounds and algorithms for the PAC framework suggest that both of these problems require Ω(n) samples. However, we argue that these definitions not only conflict with how these algorithms are used in practice, but also that these results disagree with the intuition that says (i) requires only Θ(n/m) samples, where m = |{ i : μ_i > max_{j ∈ [n]} μ_j − ϵ }|, and (ii) requires Θ((n/m)k) samples, where m = |{ i : μ_i > μ_0 }|. We provide definitions that formalize these intuitions, obtain lower bounds that match the above sample complexities, and develop explicit, practical algorithms that achieve nearly matching upper bounds.
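The Θ(n/m) intuition behind problem (i) can be illustrated with a toy simulation (all names here are hypothetical, not from the paper): if m of the n arms are "good", then picking arms uniformly at random hits a good arm after n/m draws in expectation, so an adaptive algorithm need not examine all n arms.

```python
import random

def draws_until_good_arm(n, m, trials=20000, seed=0):
    """Average number of uniformly random arm picks (with replacement)
    until one of the m good arms among n total arms is selected."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        draws = 0
        while True:
            draws += 1
            # Arms 0 .. m-1 are designated the "good" arms.
            if rng.randrange(n) < m:
                break
        total += draws
    return total / trials

# With n = 1000 arms and m = 100 good ones, the expected number of
# picks is n/m = 10; it depends on the ratio n/m, not on n alone.
avg = draws_until_good_arm(1000, 100)
print(avg)
```

Each inner loop is a geometric trial with success probability m/n, whose mean is n/m; the printed average should be close to 10 for the parameters above.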
