Fantasizing with Dual GPs in Bayesian Optimization and Active Learning

11/02/2022
by Paul E. Chang, et al.

Gaussian processes (GPs) are the main surrogate functions used for sequential modelling tasks such as Bayesian optimization and active learning. Their drawbacks are poor scaling with data and the need to run an optimization loop when using a non-Gaussian likelihood. In this paper, we focus on 'fantasizing' batch acquisition functions, which need to condition on newly fantasized data in a computationally efficient way. By using a sparse dual GP parameterization, we gain linear scaling with batch size as well as one-step updates for non-Gaussian likelihoods, thus extending sparse models to greedy batch fantasizing acquisition functions.
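To illustrate the idea of fantasizing, below is a minimal sketch, not the paper's dual GP method: it uses an exact GP with a Gaussian likelihood (the baseline setting the paper improves upon) and greedily builds a batch by picking a candidate point, "fantasizing" its outcome as the current posterior mean, conditioning on that fantasized observation, and repeating. The RBF kernel, noise level, and max-variance acquisition rule are illustrative assumptions.

```python
# Minimal sketch of greedy batch fantasizing with an exact GP (Gaussian likelihood).
# Not the paper's sparse dual GP implementation; kernel, noise, and acquisition
# rule are illustrative choices.
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2, variance=1.0):
    """Squared-exponential kernel matrix between rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xs, noise=1e-2):
    """Exact GP posterior mean and marginal variance at test points Xs."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xs, Xs)) - np.sum(v**2, axis=0)
    return mean, var

def greedy_fantasy_batch(X, y, candidates, batch_size=3):
    """Greedily build a batch: pick the max-variance candidate, fantasize its
    outcome as the posterior mean, condition on it, and repeat."""
    Xf, yf = X.copy(), y.copy()
    batch = []
    for _ in range(batch_size):
        mean, var = gp_posterior(Xf, yf, candidates)
        idx = int(np.argmax(var))              # pure-exploration acquisition (illustrative)
        x_new = candidates[idx:idx + 1]
        batch.append(x_new.ravel())
        Xf = np.vstack([Xf, x_new])            # condition on the fantasized observation
        yf = np.concatenate([yf, mean[idx:idx + 1]])
    return np.array(batch)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, size=(5, 1))
    y = np.sin(6 * X).ravel() + 0.05 * rng.standard_normal(5)
    candidates = np.linspace(0, 1, 200)[:, None]
    print(greedy_fantasy_batch(X, y, candidates))
```

In this naive version, each fantasized point forces a fresh Cholesky factorization, which is what the paper's sparse dual parameterization is designed to avoid, giving linear cost in the batch size and one-step updates even for non-Gaussian likelihoods.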

