Disagreement-based combinatorial pure exploration: Efficient algorithms and an analysis with localization
We design new algorithms for the combinatorial pure exploration problem in the multi-armed bandit framework. In this problem, we are given K distributions and a collection V ⊆ 2^[K] of subsets of these distributions, and we would like to find the subset v ∈ V with the largest cumulative mean while collecting, in a sequential fashion, as few samples from the distributions as possible. We study both the fixed-budget and fixed-confidence settings, and our algorithms essentially achieve state-of-the-art performance in all settings, improving on previous guarantees for structures like matchings and submatrices that have large augmenting sets. Moreover, our algorithms can be implemented efficiently whenever the decision set V admits linear optimization. Our analysis involves precise concentration-of-measure arguments and a new algorithm for linear programming with exponentially many constraints.
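To make the problem setup concrete, here is a minimal, hedged sketch of the Best-Set task described above: it splits a fixed sampling budget evenly across the K arms and returns the candidate subset in V with the largest empirical cumulative mean. This is only an illustrative uniform-allocation baseline, not the paper's disagreement-based algorithm; the function and variable names (naive_best_set, sample_fns) are hypothetical.

import numpy as np

def naive_best_set(sample_fns, V, budget):
    """Uniform-allocation baseline for the Best-Set problem (fixed budget).

    sample_fns : list of K callables, each returning one stochastic sample
                 from the corresponding arm/distribution.
    V          : collection of candidate subsets, each a set of arm indices.
    budget     : total number of samples to spend.

    NOTE: illustrative baseline only, not the paper's disagreement-based method.
    """
    K = len(sample_fns)
    per_arm = max(1, budget // K)  # equal allocation to every arm
    means = np.array([
        np.mean([sample_fns[i]() for _ in range(per_arm)])
        for i in range(K)
    ])
    # Score each candidate subset by its empirical cumulative mean.
    return max(V, key=lambda v: means[list(v)].sum())

# Toy usage: 4 Gaussian arms, candidate sets of size 2 (e.g. edges of a matching).
rng = np.random.default_rng(0)
true_means = [0.9, 0.5, 0.8, 0.2]
arms = [lambda m=m: rng.normal(m, 1.0) for m in true_means]
V = [{0, 1}, {0, 2}, {1, 3}, {2, 3}]
print(naive_best_set(arms, V, budget=4000))  # with high probability {0, 2}, the true best set

In contrast to this uniform allocation, the algorithms in the paper adapt the sampling to the instance, which is what yields the improved guarantees for structures with large augmenting sets.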