Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization

06/19/2020
by Geoff Pleiss, et al.

Matrix square roots and their inverses arise frequently in machine learning, e.g., when sampling from high-dimensional Gaussians N(0, K) or whitening a vector b against a covariance matrix K. While existing methods typically require O(N^3) computation, we introduce a highly efficient quadratic-time algorithm for computing K^1/2 b, K^-1/2 b, and their derivatives through matrix-vector multiplications (MVMs). Our method combines Krylov subspace methods with a rational approximation and typically achieves four decimal places of accuracy with fewer than 100 MVMs. Moreover, the backward pass requires little additional computation. We demonstrate our method's applicability on matrices as large as 50,000 × 50,000 (well beyond the reach of traditional methods) with little approximation error. Applying this increased scalability to variational Gaussian processes, Bayesian optimization, and Gibbs sampling yields more powerful models with higher accuracy.
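The paper's method pairs Krylov subspace iteration with a rational approximation. As a simpler illustration of the Krylov idea alone, the sketch below approximates K^1/2 b with the Lanczos process, which touches K only through MVMs: it builds an orthonormal basis Q and a small tridiagonal matrix T with Q^T K Q ≈ T, then returns ||b|| Q sqrt(T) e1. This is not the authors' exact algorithm, and the function name and parameters are illustrative.

```python
import numpy as np

def lanczos_sqrt_mvm(matvec, b, num_iters=100):
    """Approximate K^{1/2} b via Lanczos, using K only through matvec.

    Builds an orthonormal Krylov basis Q and tridiagonal T = Q^T K Q,
    then returns ||b|| * Q @ sqrt(T) @ e1.
    """
    n = b.shape[0]
    m = min(num_iters, n)
    Q = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(max(m - 1, 0))
    Q[:, 0] = b / np.linalg.norm(b)
    q_prev = np.zeros(n)
    beta_prev = 0.0
    for j in range(m):
        # three-term Lanczos recurrence: v = K q_j - beta_{j-1} q_{j-1}
        v = matvec(Q[:, j]) - beta_prev * q_prev
        alpha[j] = Q[:, j] @ v
        v -= alpha[j] * Q[:, j]
        # full reorthogonalization for numerical stability
        v -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ v)
        if j < m - 1:
            beta[j] = np.linalg.norm(v)
            if beta[j] < 1e-12:  # Krylov space exhausted; truncate
                m = j + 1
                Q, alpha, beta = Q[:, :m], alpha[:m], beta[:m - 1]
                break
            q_prev = Q[:, j]
            beta_prev = beta[j]
            Q[:, j + 1] = v / beta[j]
    # square root of the small tridiagonal T via its eigendecomposition
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals, evecs = np.linalg.eigh(T)
    sqrtT = evecs @ np.diag(np.sqrt(np.maximum(evals, 0.0))) @ evecs.T
    e1 = np.zeros(m)
    e1[0] = 1.0
    return np.linalg.norm(b) * (Q @ (sqrtT @ e1))
```

For well-conditioned K, a few dozen iterations typically suffice; the rational-approximation machinery in the paper is what extends this accuracy and efficiency to ill-conditioned matrices and to the backward pass.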


Related research

10/29/2018: Scaling Gaussian Process Regression with Derivatives
Gaussian processes (GPs) with derivatives are useful in many application...

06/11/2020: Fast increased fidelity approximate Gibbs samplers for Bayesian Gaussian process regression
The use of Gaussian processes (GPs) is supported by efficient sampling a...

02/25/2022: High-Dimensional Sparse Bayesian Learning without Covariance Matrices
Sparse Bayesian learning (SBL) is a powerful framework for tackling the ...

06/08/2021: The Fast Kernel Transform
Kernel methods are a highly effective and widely used collection of mode...

01/16/2020: Scalable Hyperparameter Optimization with Lazy Gaussian Processes
Most machine learning methods require careful selection of hyper-paramet...

03/28/2018: Quantum algorithms for training Gaussian Processes
Gaussian processes (GPs) are important models in supervised machine lear...

07/22/2021: Kernel-Matrix Determinant Estimates from stopped Cholesky Decomposition
Algorithms involving Gaussian processes or determinantal point processes...
