Optimistic optimization of a Brownian

01/15/2019
by Jean-Bastien Grill, et al.

We address the problem of optimizing a Brownian motion. We consider a (random) realization W of a Brownian motion with input space in [0,1]. Given W, our goal is to return an ϵ-approximation of its maximum using the smallest possible number of function evaluations, the sample complexity of the algorithm. We provide an algorithm with sample complexity of order log^2(1/ϵ). This improves over previous results of Al-Mharmah and Calvin (1996) and Calvin et al. (2017), which provided only polynomial rates. Our algorithm is adaptive---each query depends on previous values---and is an instance of the optimism-in-the-face-of-uncertainty principle.
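The optimism-in-the-face-of-uncertainty idea described above can be sketched in a few lines. The sketch below is an illustrative reading, not the paper's exact algorithm: the path W is simulated lazily, with each new query at the midpoint of an interval drawn from the Brownian-bridge conditional distribution given the endpoint values, and intervals are prioritized by an upper confidence bound on the bridge maximum. The confidence-width constant `eta` and the stopping rule are assumptions chosen for the sketch, based on the standard tail bound P[max of bridge > max(wa, wb) + h] ≤ exp(-2h²/(b - a)).

```python
import heapq
import math
import random


def optimistic_brownian_max(eps, delta=1e-3, seed=0):
    """Return an (eps, delta)-style approximation of max W on [0, 1].

    Illustrative sketch only: W is simulated on the fly via Brownian-bridge
    midpoint sampling, and the most optimistic interval is refined first.
    """
    rng = random.Random(seed)
    w0, w1 = 0.0, rng.gauss(0.0, 1.0)  # W(0) = 0; W(1) ~ N(0, 1)
    # Confidence-width scale (an assumption for this sketch).
    eta = math.sqrt(0.5 * math.log(1.0 / delta))

    def ucb(a, wa, b, wb):
        # Optimistic bound on the max of the Brownian bridge over [a, b].
        return max(wa, wb) + eta * math.sqrt(b - a)

    # Max-heap of intervals, keyed by negated optimistic value.
    heap = [(-ucb(0.0, w0, 1.0, w1), 0.0, w0, 1.0, w1)]
    best = max(w0, w1)
    n_queries = 1
    while True:
        neg_u, a, wa, b, wb = heapq.heappop(heap)
        if -neg_u <= best + eps:
            # No interval can optimistically beat the incumbent by eps: stop.
            return best, n_queries
        # Query the midpoint: conditionally on the endpoints, W(m) is Gaussian
        # with mean (wa + wb)/2 and variance (b - a)/4 (Brownian bridge).
        m = 0.5 * (a + b)
        wm = rng.gauss(0.5 * (wa + wb), math.sqrt((b - a) / 4.0))
        n_queries += 1
        best = max(best, wm)
        heapq.heappush(heap, (-ucb(a, wa, m, wm), a, wa, m, wm))
        heapq.heappush(heap, (-ucb(m, wm, b, wb), m, wm, b, wb))
```

Because the optimistic bonus `eta * sqrt(b - a)` shrinks as intervals are split, only intervals whose bridge could still exceed the incumbent by ϵ get refined, which is what drives the logarithmic sample complexity in the paper.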

