Scalable Log Determinants for Gaussian Process Kernel Learning

11/09/2017
by Kun Dong, et al.

For applications as varied as Bayesian neural networks, determinantal point processes, elliptical graphical models, and kernel learning for Gaussian processes (GPs), one must compute a log determinant of an n × n positive definite matrix and its derivatives, which leads to prohibitive O(n^3) computations. We propose novel O(n) approaches to estimating these quantities from only fast matrix-vector multiplications (MVMs). These stochastic approximations are based on Chebyshev, Lanczos, and surrogate models, and converge quickly even for kernel matrices with challenging spectra. We leverage these approximations to develop a scalable Gaussian process approach to kernel learning. We find that Lanczos is generally superior to Chebyshev for kernel learning, and that a surrogate approach can be highly efficient and accurate with popular kernels.


Related research

10/29/2018: Scaling Gaussian Process Regression with Derivatives
Gaussian processes (GPs) with derivatives are useful in many application...

02/21/2018: VBALD - Variational Bayesian Approximation of Log Determinants
Evaluating the log determinant of a positive definite matrix is ubiquito...

06/28/2021: Variance Reduction for Matrix Computations with Applications to Gaussian Processes
In addition to recent developments in computing speed and memory, method...

08/11/2022: Gaussian process surrogate models for neural networks
The lack of insight into deep learning systems hinders their systematic ...

12/31/2021: When are Iterative Gaussian Processes Reliably Accurate?
While recent work on conjugate gradient methods and Lanczos decompositio...

08/14/2022: A Scalable Method to Exploit Screening in Gaussian Process Models with Noise
A common approach to approximating Gaussian log-likelihoods at scale exp...

06/05/2020: Sparse Gaussian Processes via Parametric Families of Compactly-supported Kernels
Gaussian processes are powerful models for probabilistic machine learnin...
