Practical Hilbert space approximate Bayesian Gaussian processes for probabilistic programming

by Gabriel Riutort-Mayol, et al.

Gaussian processes are powerful non-parametric probabilistic models for stochastic functions. However, they entail a computational cost that becomes intractable when the number of observations is large, especially when estimated with fully Bayesian methods such as Markov chain Monte Carlo. In this paper, we focus on a novel approach to low-rank approximate Bayesian Gaussian processes, based on a basis function approximation via Laplace eigenfunctions for stationary covariance functions. The main contribution of this paper is a detailed analysis of the performance and practical implementation of the method in relation to key factors such as the number of basis functions, the domain of the prediction space, and the smoothness of the latent function. We provide intuitive visualizations and recommendations for choosing the values of these factors, which make it easier for users to improve approximation accuracy and computational performance. We also propose diagnostics for checking that the number of basis functions and the domain of the prediction space are adequate given the data. The proposed approach is simple, exhibits an attractive computational complexity due to its linear structure, and is easy to implement in probabilistic programming frameworks. Several illustrative examples of the performance and applicability of the method in the probabilistic programming language Stan are presented together with the underlying Stan model code.
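The basis function approximation the abstract refers to represents the latent function as a weighted sum of Laplace eigenfunctions on a bounded domain, with weights scaled by the kernel's spectral density evaluated at the square roots of the eigenvalues. The following NumPy sketch (an illustration under standard assumptions for the 1-D squared-exponential kernel on the interval [-L, L], not the paper's Stan code) shows how a low-rank GP prior draw can be built this way with cost linear in the number of input points:

```python
import numpy as np

def hsgp_basis(x, m, L):
    """Laplace eigenfunctions of the negative Laplacian on [-L, L].

    Returns the n-by-m basis matrix phi and the square roots of the
    corresponding eigenvalues, sqrt(lambda_j) = pi * j / (2 * L).
    """
    j = np.arange(1, m + 1)
    sqrt_lam = np.pi * j / (2.0 * L)
    phi = np.sqrt(1.0 / L) * np.sin(sqrt_lam * (x[:, None] + L))
    return phi, sqrt_lam

def sqexp_spectral_density(omega, sigma, ell):
    """1-D spectral density of the squared-exponential covariance."""
    return sigma**2 * np.sqrt(2.0 * np.pi) * ell * np.exp(-0.5 * (ell * omega) ** 2)

# One approximate GP prior sample: f(x) ~= sum_j sqrt(S(sqrt(lambda_j))) phi_j(x) beta_j
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
m, L = 32, 1.5                      # number of basis functions, boundary factor
phi, sqrt_lam = hsgp_basis(x, m, L)
scale = np.sqrt(sqexp_spectral_density(sqrt_lam, sigma=1.0, ell=0.3))
beta = rng.standard_normal(m)       # iid standard normal weights
f = phi @ (scale * beta)            # low-rank draw; no n-by-n covariance needed
```

Because the basis matrix depends only on the inputs and the boundary L, it can be precomputed once; the hyperparameters sigma and ell enter only through the diagonal scaling, which is what makes the method cheap to re-evaluate inside MCMC.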




Bayesian Modeling with Gaussian Processes using the GPstuff Toolbox

Gaussian processes (GP) are powerful tools for probabilistic modeling pu...

Semivariogram Hyper-Parameter Estimation for Whittle-Matérn Priors in Bayesian Inverse Problems

We present a detailed mathematical description of the connection between...

Approximate Sampling using an Accelerated Metropolis-Hastings based on Bayesian Optimization and Gaussian Processes

Markov Chain Monte Carlo (MCMC) methods have a drawback when working wit...

Bayesian Kernelized Tensor Factorization as Surrogate for Bayesian Optimization

Bayesian optimization (BO) primarily uses Gaussian processes (GP) as the...

Spectrum Gaussian Processes Based On Tunable Basis Functions

Spectral approximation and variational inducing learning for the Gaussia...

Bézier Curve Gaussian Processes

Probabilistic models for sequential data are the basis for a variety of ...

On Bayesian Generalized Additive Models

Generalized additive models (GAMs) provide a way to blend parametric and...
