Fast increased fidelity approximate Gibbs samplers for Bayesian Gaussian process regression

06/11/2020
by Kelly R. Moran, et al.

The use of Gaussian processes (GPs) is supported by efficient sampling algorithms, a rich methodological literature, and strong theoretical grounding. However, due to their prohibitive computation and storage demands, the use of exact GPs in Bayesian models is limited to problems containing at most several thousand observations. Sampling requires matrix operations that scale at 𝒪(n^3), where n is the number of unique inputs. Storage of individual matrices scales at 𝒪(n^2) and can quickly overwhelm the resources of most modern computers. To overcome these bottlenecks, we develop a sampling algorithm using ℋ-matrix approximation of the matrices comprising the GP posterior covariance. These matrices can approximate the true conditional covariance matrix within machine precision and allow for sampling algorithms that scale at 𝒪(n log^2 n) time, with storage demands scaling at 𝒪(n log n). We also describe how these algorithms can be used as building blocks to model higher-dimensional surfaces at 𝒪(d n log^2 n), where d is the dimension of the surface under consideration, using tensor products of one-dimensional GPs. Though various scalable processes have been proposed for approximating Bayesian GP inference when n is large, to our knowledge, none of these methods ensure that the approximation's Kullback-Leibler divergence to the true posterior can be made arbitrarily small and may be no worse than the approximation provided by finite computer arithmetic. We describe ℋ-matrices, give an efficient Gibbs sampler using these matrices for one-dimensional GPs, offer a proposed extension to higher-dimensional surfaces, and investigate the performance of this fast increased fidelity approximate GP, FIFA-GP, using both simulated and real data sets.
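To make the bottleneck concrete, the sketch below shows the exact-arithmetic conditional draw f | y that a Gibbs sampler for one-dimensional GP regression performs at each iteration, using dense 𝒪(n^3) Cholesky factorizations; these are the operations the paper replaces with ℋ-matrix approximations. This is a minimal NumPy illustration under assumed choices (squared-exponential kernel, fixed lengthscale, helper names), not the authors' FIFA-GP code.

```python
import numpy as np

def sq_exp_kernel(x, lengthscale=0.2, variance=1.0):
    """Squared-exponential covariance matrix on 1-D inputs x (illustrative choice)."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_f_conditional(y, K, sigma2, rng):
    """Draw f | y ~ N(mu, C) with
        mu = K (K + sigma2 I)^{-1} y,
        C  = K - K (K + sigma2 I)^{-1} K.
    The dense Cholesky factorizations below cost O(n^3) time and O(n^2)
    storage -- exactly the scaling the H-matrix approach targets."""
    n = y.shape[0]
    L = np.linalg.cholesky(K + sigma2 * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # (K + sigma2 I)^{-1} y
    mu = K @ alpha
    V = np.linalg.solve(L, K)                            # so V.T @ V = K (K + sigma2 I)^{-1} K
    C = K - V.T @ V
    Lc = np.linalg.cholesky(C + 1e-8 * np.eye(n))        # small jitter for numerical stability
    return mu + Lc @ rng.standard_normal(n)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(size=200))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(200)
f_draw = sample_f_conditional(y, sq_exp_kernel(x), sigma2=0.01, rng=rng)
```

In the paper's setting, both factorizations above are replaced by hierarchical (ℋ-matrix) factorizations accurate to near machine precision, which is what brings the per-iteration cost down to 𝒪(n log^2 n).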


Related research

MCMC for Variationally Sparse Gaussian Processes (06/12/2015)
Gaussian process (GP) models form a core part of probabilistic machine l...

Precision Aggregated Local Models (05/27/2020)
Large scale Gaussian process (GP) regression is infeasible for larger da...

Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization (06/19/2020)
Matrix square roots and their inverses arise frequently in machine learn...

Scaling Multidimensional Inference for Structured Gaussian Processes (09/18/2012)
Exact Gaussian Process (GP) regression has O(N^3) runtime for data size ...

Sequential construction and dimension reduction of Gaussian processes under inequality constraints (09/09/2020)
Accounting for inequality constraints, such as boundedness, monotonicity...

Speeding up the binary Gaussian process classification (03/15/2012)
Gaussian processes (GP) are attractive building blocks for many probabil...
