Product Kernel Interpolation for Scalable Gaussian Processes

02/24/2018
by Jacob R. Gardner, et al.

Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs). Structured Kernel Interpolation (SKI) exploits these techniques by deriving approximate kernels with very fast MVMs. Unfortunately, such strategies suffer badly from the curse of dimensionality. We develop a new technique for MVM-based learning that exploits product kernel structure. We demonstrate that this technique is broadly applicable, resulting in linear rather than exponential runtime with dimension for SKI, as well as state-of-the-art asymptotic complexity for multi-task GPs.
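The abstract's core idea, inference through fast MVMs on a structured (product) kernel, can be sketched as follows. This is a minimal illustration, not the paper's SKIP algorithm: it assumes a kernel that factors exactly as a Kronecker product over two input dimensions, and the helper names `kron_mvm` and `rbf` are illustrative. The Kronecker identity lets us multiply by the full n1*n2 kernel matrix without ever forming it, and conjugate gradients then solves the GP linear system using only those fast MVMs.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def kron_mvm(K1, K2, v):
    """Compute (K1 kron K2) @ v without materializing the Kronecker product.

    Uses the row-major identity (K1 kron K2) vec(V) = vec(K1 V K2^T),
    costing O(n1*n2*(n1+n2)) instead of O((n1*n2)^2).
    """
    n1, n2 = K1.shape[0], K2.shape[0]
    V = v.reshape(n1, n2)
    return (K1 @ V @ K2.T).ravel()

def rbf(x, lengthscale=1.0):
    """Illustrative 1-D RBF Gram matrix (assumption, not from the paper)."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x1, x2 = rng.uniform(size=8), rng.uniform(size=9)  # toy grid per dimension
K1, K2 = rbf(x1), rbf(x2)
n = K1.shape[0] * K2.shape[0]
noise = 1e-2  # observation noise variance, regularizes the solve

# Solve (K + noise*I) alpha = y by conjugate gradients, touching the
# kernel only through the fast MVM above.
A = LinearOperator((n, n), matvec=lambda v: kron_mvm(K1, K2, v) + noise * v)
y = rng.normal(size=n)
alpha, info = cg(A, y)  # info == 0 signals convergence
```

The same pattern underlies the paper's contribution: once each factor kernel admits a fast MVM (via SKI's interpolation-on-a-grid approximation), the product kernel does too, which is what turns the exponential grid cost in dimension into a linear one.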


Related research

- 06/12/2021: SKIing on Simplices: Kernel Interpolation on the Permutohedral Lattice for Scalable Gaussian Processes. "State-of-the-art methods for scalable Gaussian processes use iterative a..."
- 12/16/2019: Kernel-based interpolation at approximate Fekete points. "We construct approximate Fekete point sets for kernel-based interpolatio..."
- 12/31/2021: When are Iterative Gaussian Processes Reliably Accurate? "While recent work on conjugate gradient methods and Lanczos decompositio..."
- 06/16/2023: Amortized Inference for Gaussian Process Hyperparameters of Structured Kernels. "Learning the kernel parameters for Gaussian processes is often the compu..."
- 10/19/2020: Characterizing Deep Gaussian Processes via Nonlinear Recurrence Systems. "Recent advances in Deep Gaussian Processes (DGPs) show the potential to ..."
- 01/31/2018: Kernel Distillation for Gaussian Processes. "Gaussian processes (GPs) are flexible models that can capture complex st..."
- 10/28/2020: Hierarchical Gaussian Processes with Wasserstein-2 Kernels. "We investigate the usefulness of Wasserstein-2 kernels in the context of..."
