Scalable Variational Bayesian Kernel Selection for Sparse Gaussian Process Regression

12/05/2019
by   Tong Teng, et al.

This paper presents a variational Bayesian kernel selection (VBKS) algorithm for sparse Gaussian process regression (SGPR) models. In contrast to existing GP kernel selection algorithms that aim to select only one kernel with the highest model evidence, our proposed VBKS algorithm considers the kernel as a random variable and learns its belief from data such that the uncertainty of the kernel can be interpreted and exploited to avoid overconfident GP predictions. To achieve this, we represent the probabilistic kernel as an additional variational variable in a variational inference (VI) framework for SGPR models where its posterior belief is learned together with that of the other variational variables (i.e., inducing variables and kernel hyperparameters). In particular, we transform the discrete kernel belief into a continuous parametric distribution via reparameterization in order to apply VI. Though it is computationally challenging to jointly optimize a large number of hyperparameters due to many kernels being evaluated simultaneously by our VBKS algorithm, we show that the variational lower bound of the log-marginal likelihood can be decomposed into an additive form such that each additive term depends only on a disjoint subset of the variational variables and can thus be optimized independently. Stochastic optimization is then used to maximize the variational lower bound by iteratively improving the variational approximation of the exact posterior belief via stochastic gradient ascent, which incurs constant time per iteration and hence scales to big data. We empirically evaluate the performance of our VBKS algorithm on synthetic and massive real-world datasets.
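To make the two core ideas concrete, here is a minimal illustrative sketch (not the authors' code) assuming PyTorch: a discrete belief over K candidate kernels is reparameterized through a continuous Gumbel-softmax relaxation so that VI gradients can flow through it, and a lower bound written in additive per-kernel form is maximized by mini-batch stochastic gradient ascent. The helper `elbo_term` is a hypothetical placeholder standing in for the paper's SGPR-specific lower-bound term, and the Gumbel-softmax is just one common choice of reparameterization; the paper's exact construction may differ.

```python
# Illustrative sketch only: continuous relaxation of a kernel belief plus
# stochastic gradient ascent on an additive lower bound. Assumes PyTorch.

import torch
import torch.nn.functional as F

K = 4                                                    # number of candidate kernels
logits = torch.zeros(K, requires_grad=True)              # parameterizes the kernel belief q(k)
log_lengthscales = torch.zeros(K, requires_grad=True)    # one hyperparameter set per kernel

def elbo_term(k, log_ls, x, y):
    """Hypothetical placeholder for the k-th additive lower-bound term;
    the real term would involve the SGPR inducing variables for kernel k."""
    ls = log_ls.exp()
    resid = y - x.mean()                                 # stand-in model, for illustration only
    return -(resid ** 2).mean() / ls - ls.log()

opt = torch.optim.Adam([logits, log_lengthscales], lr=1e-2)
for step in range(1000):
    x, y = torch.randn(64), torch.randn(64)              # stand-in mini-batch of data
    w = F.gumbel_softmax(logits, tau=0.5)                # reparameterized sample of the kernel belief
    # Additive form: each term depends only on its own kernel's variables,
    # so the weighted sum admits a single stochastic gradient step per iteration.
    elbo = sum(w[k] * elbo_term(k, log_lengthscales[k], x, y) for k in range(K))
    opt.zero_grad()
    (-elbo).backward()                                   # gradient ascent on the lower bound
    opt.step()

kernel_belief = torch.softmax(logits, dim=0).detach()    # learned belief over candidate kernels
```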
