
Scalable Gaussian Process Inference with Stan

by Till Hoffmann et al.

Gaussian processes (GPs) are flexible distributions for modeling functional data. While theoretically appealing, they are computationally prohibitive for all but small datasets. We implement two methods for scaling GP inference in Stan: first, a general sparse approximation based on a directed acyclic dependency graph; second, a fast, exact method for regularly spaced data modeled by GPs with stationary kernels, using the fast Fourier transform. Based on benchmark experiments, we offer guidance for practitioners choosing between the different methods and parameterizations. We illustrate the package with two real-world examples. The implementation follows Stan's design and exposes performant inference through a familiar interface. Full posterior inference for ten thousand data points is feasible on a laptop in less than 20 seconds.
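To see why the FFT helps with regularly spaced data, note that a stationary kernel evaluated on an equispaced grid with periodic boundary conditions yields a circulant covariance matrix, which the discrete Fourier transform diagonalizes. The GP log-density can then be evaluated in O(n log n) instead of the O(n^3) of a Cholesky factorization. The sketch below is illustrative NumPy, not the package's actual Stan interface; the kernel and function names are placeholders.

```python
# Illustrative sketch: FFT-based evaluation of a zero-mean GP log-density for
# regularly spaced data with a stationary (here, periodic squared-exponential)
# kernel. The covariance is circulant, so the FFT diagonalizes it.
import numpy as np

def periodic_sq_exp_cov(n, length_scale, sigma=1.0):
    """First row of a circulant squared-exponential covariance on a ring."""
    d = np.minimum(np.arange(n), n - np.arange(n))  # circular distances
    return sigma**2 * np.exp(-0.5 * (d / length_scale) ** 2)

def fft_gp_logpdf(y, cov_first_row):
    """Zero-mean Gaussian log-density in O(n log n) via the FFT."""
    n = y.size
    lam = np.fft.fft(cov_first_row).real  # eigenvalues of the circulant matrix
    y_hat = np.fft.fft(y)
    quad = (np.abs(y_hat) ** 2 / lam).sum() / n  # equals y^T C^{-1} y
    return -0.5 * (n * np.log(2 * np.pi) + np.log(lam).sum() + quad)

rng = np.random.default_rng(0)
n = 256
c = periodic_sq_exp_cov(n, length_scale=8.0) + 1e-6 * (np.arange(n) == 0)
C = np.array([np.roll(c, k) for k in range(n)])  # full circulant matrix
y = rng.multivariate_normal(np.zeros(n), C)

fast = fft_gp_logpdf(y, c)
sign, logdet = np.linalg.slogdet(C)
exact = -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))
assert np.isclose(fast, exact)
```

The O(n^3) reference computation is included only to verify the FFT result; in practice only the first row of the covariance is ever formed, which is what makes full posterior inference for tens of thousands of points tractable.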
