Bayesian Optimization – Multi-Armed Bandit Problem

12/14/2020
by Abhilash Nandy, et al.

In this report, we survey Bayesian Optimization methods focused on the Multi-Armed Bandit Problem, building on the paper "Portfolio Allocation for Bayesian Optimization". We present a short literature survey of the acquisition functions and portfolio strategies used in work on Bayesian Optimization, replicate the paper's experiments, and compare our findings with the published results. Code link: https://colab.research.google.com/drive/1GZ14klEDoe3dcBeZKo5l8qqrKf_GmBDn?usp=sharing#scrollTo=XgIBau3O45_V.
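As a rough illustration of the portfolio idea surveyed here, the sketch below runs a simplified GP-Hedge-style loop over three standard acquisition functions (Probability of Improvement, Expected Improvement, and GP-UCB) on a toy 1-D objective. The objective, candidate grid, Hedge learning rate eta, and UCB coefficient are illustrative assumptions rather than settings from the paper or the linked notebook, and scikit-learn's GaussianProcessRegressor stands in for whatever GP backend the original code uses.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def objective(x):
    # Toy 1-D objective to maximize (stand-in for the paper's benchmarks).
    return np.sin(3 * x) + 0.5 * np.cos(5 * x)

# Candidate grid over which each acquisition function is maximized.
X_grid = np.linspace(0.0, 2.0, 200).reshape(-1, 1)

# Portfolio of standard acquisition functions: PI, EI, GP-UCB.
def acq_pi(mu, sigma, best):
    return norm.cdf((mu - best) / (sigma + 1e-9))

def acq_ei(mu, sigma, best):
    z = (mu - best) / (sigma + 1e-9)
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

def acq_ucb(mu, sigma, best):
    return mu + 2.0 * sigma  # exploration coefficient is an assumed value

portfolio = [acq_pi, acq_ei, acq_ucb]
eta = 1.0                          # Hedge learning rate (assumed value)
gains = np.zeros(len(portfolio))   # cumulative reward per acquisition function

# Initial design: two random observations.
X = rng.uniform(0.0, 2.0, size=(2, 1))
y = objective(X).ravel()

for t in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(X_grid, return_std=True)
    best = y.max()

    # Each acquisition function nominates its own candidate point.
    nominees = [X_grid[np.argmax(a(mu, sigma, best))] for a in portfolio]

    # Hedge step: sample which nominee to evaluate, softmax over gains.
    p = np.exp(eta * (gains - gains.max()))
    p /= p.sum()
    k = rng.choice(len(portfolio), p=p)
    x_next = nominees[k].reshape(1, 1)

    # Evaluate the objective and extend the data set.
    y_next = objective(x_next).ravel()
    X, y = np.vstack([X, x_next]), np.concatenate([y, y_next])

    # Reward every arm with the GP posterior mean at its nominee,
    # a simplified version of the GP-Hedge reward rule.
    gains += gp.predict(np.vstack(nominees))

print(f"best observed value: {y.max():.4f} at x = {X[y.argmax()][0]:.4f}")

The design point worth noting is that Hedge keeps one cumulative gain per acquisition function and samples the next evaluation from a softmax over those gains, so acquisition functions that repeatedly nominate high-value points come to dominate the portfolio rather than being fixed in advance.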
