Scalable Gaussian Processes with Grid-Structured Eigenfunctions (GP-GRIEF)

07/05/2018
by Trefor W. Evans, et al.

We introduce a kernel approximation strategy that enables computation of the Gaussian process log marginal likelihood and all hyperparameter derivatives in O(p) time. Our GRIEF kernel consists of p eigenfunctions found using a Nyström approximation on a dense Cartesian product grid of inducing points. By exploiting algebraic properties of Kronecker and Khatri-Rao tensor products, the computational complexity of the training procedure can be made practically independent of the number of inducing points. This allows us to use arbitrarily many inducing points to achieve a globally accurate kernel approximation, even in high-dimensional problems. The fast likelihood evaluation enables type-I or type-II Bayesian inference on large-scale datasets. We benchmark our algorithms on real-world problems with up to two million training points and 10^33 inducing points.
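To make the construction concrete, the sketch below illustrates the two ingredients the abstract describes: Nyström eigenfunctions built from inducing points on a Cartesian product grid, and the Kronecker/Khatri-Rao factorizations that keep every step's cost independent of the total grid size m = m_1 * ... * m_d. This is a minimal illustration under stated assumptions, not the authors' implementation: the squared-exponential kernel, the lengthscale, the 30 x 40 x 50 grid, the heap-based top-p eigenvalue search, and all function names are choices made for the example.

```python
# Minimal sketch (not the authors' code). Assumptions: a squared-exponential
# product kernel, a small 3-D inducing grid, and a lazy max-heap search for
# the p largest Kronecker eigenvalues (valid because they are positive).
import heapq
import numpy as np

def se_kernel(a, b, ell=0.2):
    """1-D squared-exponential kernel matrix between point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Per-dimension grid factors; the inducing grid is their Cartesian product,
# so m = 30 * 40 * 50 = 60,000 inducing points are represented implicitly.
grids = [np.linspace(0.0, 1.0, m) for m in (30, 40, 50)]

# Eigendecompose each small m_j x m_j factor. Because K_UU factorizes as
# K_1 (x) K_2 (x) K_3, its eigenpairs are products of these per-dimension
# pairs, and the 60,000 x 60,000 matrix is never formed.
eigs = [np.linalg.eigh(se_kernel(g, g)) for g in grids]

def top_p_kron_eigs(lams, p):
    """Values/indices of the p largest products lams[0][i_1]*...*lams[d][i_d],
    found lazily with a max-heap instead of sorting all prod(m_j) products."""
    order = [np.argsort(-l) for l in lams]  # descending rank per dimension
    def val(rank):
        return np.prod([l[o[r]] for l, o, r in zip(lams, order, rank)])
    start = (0,) * len(lams)
    heap, seen, out = [(-val(start), start)], {start}, []
    while heap and len(out) < p:
        negv, rank = heapq.heappop(heap)
        out.append((-negv, tuple(o[r] for o, r in zip(order, rank))))
        for j in range(len(rank)):  # push the frontier neighbours
            nxt = rank[:j] + (rank[j] + 1,) + rank[j + 1:]
            if nxt[j] < len(lams[j]) and nxt not in seen:
                seen.add(nxt)
                heapq.heappush(heap, (-val(nxt), nxt))
    return out

p = 100
top = top_p_kron_eigs([lam for lam, _ in eigs], p)
lam_top = np.array([v for v, _ in top])

def eigenfunctions(X, grids, eigs, top):
    """Nystrom eigenfunctions phi_i(x) = k(x, U) q_i / lam_i. On a product
    grid with a product kernel, k(x, U) is a row-wise Kronecker (Khatri-Rao)
    product of 1-D cross-covariances, so each inner product collapses into
    d small matrix-vector products -- a cost independent of m."""
    cross = [se_kernel(X[:, j], g) for j, g in enumerate(grids)]
    Phi = np.empty((X.shape[0], len(top)))
    for i, (lam_i, idx) in enumerate(top):
        f = np.ones(X.shape[0])
        for j, (_, Q) in enumerate(eigs):
            f *= cross[j] @ Q[:, idx[j]]
        Phi[:, i] = f / lam_i
    return Phi

# GRIEF-style approximate kernel between n inputs: K ~= Phi Lambda Phi^T.
X = np.random.rand(5, len(grids))
Phi = eigenfunctions(X, grids, eigs, top)
K_approx = (Phi * lam_top) @ Phi.T

# With the n x p features in hand, the log marginal likelihood of targets y
# under K_approx + noise * I needs only p x p linear algebra (Woodbury
# identity and matrix determinant lemma) rather than an n x n Cholesky.
y, noise = np.random.randn(X.shape[0]), 0.1
A = np.diag(1.0 / lam_top) + Phi.T @ Phi / noise          # p x p
alpha = Phi.T @ y / noise
quad = y @ y / noise - alpha @ np.linalg.solve(A, alpha)  # y^T K^{-1} y
logdet = (np.linalg.slogdet(A)[1] + np.log(lam_top).sum()
          + len(y) * np.log(noise))                       # log |K|
lml = -0.5 * (quad + logdet + len(y) * np.log(2 * np.pi))
```

Because only the selected eigenvalues and the per-dimension factors enter any computation, the full grid kernel matrix (60,000 x 60,000 here, and up to 10^33 inducing points in the paper's experiments) never needs to be materialized, and the final lines show how the resulting n x p feature matrix reduces likelihood evaluation to p-dimensional linear algebra, which is the source of the fast type-I/II inference the abstract mentions.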

