Regret Bounds for Deterministic Gaussian Process Bandits

03/09/2012
by Nando de Freitas, et al.

This paper analyses the problem of Gaussian process (GP) bandits with deterministic observations. The analysis uses a branch and bound algorithm related to the UCB algorithm of Srinivas et al. (2010). For GPs with Gaussian observation noise of strictly positive variance, Srinivas et al. (2010) proved that the regret vanishes at the approximate rate of O(1/√t), where t is the number of observations. To complement their result, we attack the deterministic case and attain a much faster exponential convergence rate. Under some regularity assumptions, we show that the regret decreases asymptotically according to O(e^(−τt/(ln t)^(d/4))) with high probability. Here, d is the dimension of the search space and τ is a constant that depends on the behaviour of the objective function near its global maximum.
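To make the setting concrete, here is a minimal sketch of a GP-UCB-style loop on a noiseless (deterministic) objective. The squared-exponential kernel, the fixed candidate grid, the confidence-width schedule beta, and the test function are all illustrative assumptions; the paper's own method additionally uses branch and bound to shrink the search region, which is not reproduced here.

# Illustrative GP-UCB-style loop on a deterministic objective (sketch only).
# Assumptions not taken from the paper: squared-exponential kernel, a fixed
# candidate grid instead of branch and bound, and a hand-picked beta schedule.
import numpy as np

def sq_exp_kernel(a, b, length_scale=0.2):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, jitter=1e-10):
    """Noiseless GP posterior mean and standard deviation at x_test."""
    K = sq_exp_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    K_s = sq_exp_kernel(x_train, x_test)
    K_ss = sq_exp_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.clip(np.diag(K_ss) - np.sum(v ** 2, axis=0), 0.0, None)
    return mean, np.sqrt(var)

def objective(x):
    """Deterministic black-box objective (hypothetical test function)."""
    return np.sin(3 * np.pi * x) * x + 0.5 * x

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 500)           # candidate points in the search space
f_star = objective(grid).max()              # best value, for regret reporting only

x_obs = np.array([rng.uniform()])           # start from one random evaluation
y_obs = objective(x_obs)

for t in range(1, 31):
    mean, std = gp_posterior(x_obs, y_obs, grid)
    beta = 2.0 * np.log(len(grid) * (t + 1) ** 2)   # illustrative confidence width
    ucb = mean + np.sqrt(beta) * std
    x_next = grid[np.argmax(ucb)]           # evaluate where the upper bound is largest
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))
    regret = f_star - y_obs.max()
    print(f"t={t:2d}  x={x_next:.3f}  simple regret={regret:.2e}")

Because the observations are exact, the posterior standard deviation collapses to zero at evaluated points, so the upper confidence bound quickly rules out regions away from the maximum; the paper's analysis quantifies how fast this happens under its regularity assumptions.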
