Practical Hilbert space approximate Bayesian Gaussian processes for probabilistic programming

04/23/2020
by Gabriel Riutort-Mayol, et al.

Gaussian processes are powerful non-parametric probabilistic models for stochastic functions. However, they entail a computational complexity that becomes intractable when the number of observations is large, especially when estimated with fully Bayesian methods such as Markov chain Monte Carlo. In this paper, we focus on a novel approach for low-rank approximate Bayesian Gaussian processes, based on a basis function approximation via Laplace eigenfunctions for stationary covariance functions. The main contribution of this paper is a detailed analysis of the performance and practical implementation of the method in relation to key factors such as the number of basis functions, domain of the prediction space, and smoothness of the latent function. We provide intuitive visualizations and recommendations for choosing the values of these factors, which make it easier for users to improve approximation accuracy and computational performance. We also propose diagnostics for checking that the number of basis functions and the domain of the prediction space are adequate given the data. The proposed approach is simple and exhibits an attractive computational complexity due to its linear structure, and it is easy to implement in probabilistic programming frameworks. Several illustrative examples of the performance and applicability of the method in the probabilistic programming language Stan are presented together with the underlying Stan model code.
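To illustrate the core idea, the sketch below shows a minimal Stan model of the Hilbert space approximation for a one-dimensional GP regression with a squared exponential covariance function: the latent function is written as a linear combination of Laplace eigenfunctions on [-L, L], weighted by the square root of the covariance spectral density evaluated at the square roots of the eigenvalues. This is an illustrative sketch, not the model code accompanying the paper; the function names (eigval, eigfun, spd_se), the priors, and the treatment of M and L as fixed data are assumptions made for the example.

functions {
  // j-th eigenvalue of the Laplacian on [-L, L] with Dirichlet boundaries
  real eigval(real L, int j) {
    return square(j * pi() / (2 * L));
  }
  // j-th eigenfunction evaluated at the input vector x
  vector eigfun(real L, int j, vector x) {
    return sqrt(1.0 / L) * sin(sqrt(eigval(L, j)) * (x + L));
  }
  // spectral density of the squared exponential covariance function
  real spd_se(real alpha, real rho, real w) {
    return square(alpha) * sqrt(2 * pi()) * rho * exp(-0.5 * square(rho * w));
  }
}
data {
  int<lower=1> N;        // number of observations
  vector[N] x;           // inputs, assumed centred around zero
  vector[N] y;           // outputs
  int<lower=1> M;        // number of basis functions
  real<lower=0> L;       // boundary of the approximation domain [-L, L]
}
transformed data {
  matrix[N, M] PHI;      // basis functions evaluated at the inputs
  for (j in 1:M) {
    PHI[, j] = eigfun(L, j, x);
  }
}
parameters {
  real<lower=0> alpha;   // marginal standard deviation
  real<lower=0> rho;     // lengthscale
  real<lower=0> sigma;   // observation noise standard deviation
  vector[M] beta;        // basis function coefficients
}
model {
  vector[M] sqrt_spd;    // sqrt of spectral density at sqrt(eigenvalues)
  vector[N] f;           // approximate latent function
  for (j in 1:M) {
    sqrt_spd[j] = sqrt(spd_se(alpha, rho, sqrt(eigval(L, j))));
  }
  f = PHI * (sqrt_spd .* beta);
  alpha ~ normal(0, 1);
  rho ~ inv_gamma(5, 5);
  sigma ~ normal(0, 1);
  beta ~ std_normal();
  y ~ normal(f, sigma);
}

Because the latent function enters the likelihood only through the linear combination PHI * (sqrt_spd .* beta), and PHI depends on neither the covariance hyperparameters nor the coefficients, each likelihood evaluation scales linearly in the number of observations for a fixed number of basis functions, which is what makes the approximation attractive for MCMC.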


