Gaussian Process Sampling and Optimization with Approximate Upper and Lower Bounds

10/22/2021
by Vu Nguyen et al.

Many functions have approximately known upper and/or lower bounds, and this knowledge can potentially aid in modeling them. In this paper, we introduce Gaussian process models for functions where such bounds are (approximately) known. More specifically, we propose the first use of such bounds to improve Gaussian process (GP) posterior sampling and Bayesian optimization (BO). That is, we transform a GP model so that it satisfies the given bounds, and then sample and weight functions from its posterior. To further exploit these bounds in BO settings, we present bounded entropy search (BES), which selects the point that gains the most information about the underlying function, as estimated from the GP samples, while satisfying the output constraints. We characterize the sample variance bounds and show that the decisions made by BES are explainable. Our proposed approach is conceptually straightforward and can be used as a plug-in extension to existing methods for GP posterior sampling and Bayesian optimization.
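
The sample-and-weight idea lends itself to a short illustration. Below is a minimal sketch in Python (assuming scikit-learn, a squared-exponential kernel, output bounds of [-1, 1], and an illustrative exponential down-weighting of bound violations; these specifics are assumptions for the example, not the paper's exact construction). It draws GP posterior sample paths and re-weights them according to how well they respect the approximate output bounds.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy data from a function whose outputs are (approximately) known to lie in [-1, 1].
rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(8, 1))
y_train = np.sin(6.0 * X_train).ravel()
lower, upper = -1.0, 1.0   # approximately known output bounds (assumed for this example)

# Fit a standard (unconstrained) GP to the observations.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6)
gp.fit(X_train, y_train)

# Draw posterior sample paths on a dense grid of candidate inputs.
X_test = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
samples = gp.sample_y(X_test, n_samples=500, random_state=0)   # shape (200, 500)

# Weight each sample path by how well it respects the approximate bounds:
# paths that stay inside [lower, upper] keep full weight, and paths that
# stray outside are exponentially down-weighted by their total excursion.
# (The penalty scale 10.0 is an illustrative choice, not from the paper.)
violation = (np.clip(samples - upper, 0.0, None) +
             np.clip(lower - samples, 0.0, None)).sum(axis=0)
weights = np.exp(-10.0 * violation)
weights /= weights.sum()

# A bound-aware posterior mean as a weighted average of the sample paths.
bounded_mean = samples @ weights

In a BO loop, such weighted samples would stand in for plain posterior samples when estimating information gain; the paper's BES criterion builds on bound-respecting GP samples in this spirit, though its exact construction differs from this sketch.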

Related research

09/15/2020 · On Information Gain and Regret Bounds in Gaussian Process Bandits
Consider the sequential optimization of an expensive to evaluate and pos...

03/23/2020 · Efficient Gaussian Process Bandits by Believing only Informative Actions
Bayesian optimization is a framework for global search via maximum a pos...

02/12/2020 · Regret Bounds for Noise-Free Bayesian Optimization
Bayesian optimisation is a powerful method for non-convex black-box opti...

10/28/2018 · Bounded Regression with Gaussian Process Projection
Examples with bound information on the regression function and density a...

06/21/2021 · Machine Learning based optimization for interval uncertainty propagation with application to vibro-acoustic models
Two non-intrusive uncertainty propagation approaches are proposed for th...

02/11/2019 · Harnessing Low-Fidelity Data to Accelerate Bayesian Optimization via Posterior Regularization
Bayesian optimization (BO) is a powerful derivative-free technique for g...
