General linear-time inference for Gaussian Processes on one dimension

03/11/2020
by Jackson Loper, et al.

Gaussian Processes (GPs) provide a powerful probabilistic framework for interpolation, forecasting, and smoothing, but have been hampered by computational scaling issues. Here we prove that for data sampled on one dimension (e.g., a time series sampled at arbitrarily-spaced intervals), approximate GP inference at any desired level of accuracy requires computational effort that scales linearly with the number of observations; this new theorem enables inference on much larger datasets than was previously feasible. To achieve this improved scaling we propose a new family of stationary covariance kernels: the Latent Exponentially Generated (LEG) family, which admits a convenient stable state-space representation that allows linear-time inference. We prove that any continuous integrable stationary kernel can be approximated arbitrarily well by some member of the LEG family. The proof draws connections to Spectral Mixture Kernels, providing new insight about the flexibility of this popular family of kernels. We propose parallelized algorithms for performing inference and learning in the LEG model, test the algorithm on real and synthetic data, and demonstrate scaling to datasets with billions of samples.
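The key idea behind the linear-time scaling is that a kernel with a state-space representation lets GP inference be carried out by Kalman filtering, one observation at a time, rather than by factoring an n-by-n covariance matrix. The sketch below illustrates this for the simplest such kernel, the Ornstein-Uhlenbeck (Matérn-1/2) kernel k(τ) = σ_f² exp(−|τ|/ℓ); it is not the paper's LEG parameterization, just a minimal one-dimensional member of the state-space family, and the function name and parameters are illustrative assumptions.

```python
import numpy as np

def ou_kalman_posterior_mean(t, y, ell=1.0, sigma_f=1.0, sigma_n=0.1):
    """O(n) GP filtering for the OU kernel k(tau) = sigma_f^2 exp(-|tau|/ell).

    Between consecutive observation times the latent state evolves as
        z_{k+1} = a * z_k + noise,   a = exp(-dt/ell),
    with process-noise variance sigma_f^2 * (1 - a^2), so a scalar Kalman
    filter replaces the usual O(n^3) covariance-matrix solve.
    Observations are y_k = z_k + Gaussian noise with std sigma_n.
    """
    n = len(t)
    m, P = 0.0, sigma_f ** 2          # stationary prior: mean 0, var sigma_f^2
    means = np.empty(n)
    for k in range(n):
        if k > 0:
            a = np.exp(-(t[k] - t[k - 1]) / ell)       # transition coefficient
            m = a * m                                   # predict mean
            P = a * a * P + sigma_f ** 2 * (1.0 - a * a)  # predict variance
        S = P + sigma_n ** 2          # innovation variance
        K = P / S                     # Kalman gain
        m = m + K * (y[k] - m)        # update with observation y[k]
        P = (1.0 - K) * P
        means[k] = m                  # filtered mean (a backward pass would smooth)
    return means
```

Each observation costs constant work, so the whole pass is linear in n even for arbitrarily spaced sample times; a backward (Rauch-Tung-Striebel) pass would upgrade the filtered means to full smoothing posteriors.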


