Sparse Cholesky factorization by Kullback-Leibler minimization

04/29/2020
by Florian Schäfer et al.

We propose to compute a sparse approximate inverse Cholesky factor L of a dense covariance matrix Θ by minimizing the Kullback-Leibler divergence between the Gaussian distributions 𝒩(0, Θ) and 𝒩(0, L^-⊤ L^-1), subject to a sparsity constraint. Surprisingly, this problem has a closed-form solution that can be computed efficiently, recovering the popular Vecchia approximation in spatial statistics. Based on recent results on the approximate sparsity of inverse Cholesky factors of Θ obtained from pairwise evaluation of Green's functions of elliptic boundary-value problems at points {x_i}_1 ≤ i ≤ N ⊂ ℝ^d, we propose an elimination ordering and sparsity pattern that allows us to compute ϵ-approximate inverse Cholesky factors of such Θ in computational complexity 𝒪(N log(N/ϵ)^d) in space and 𝒪(N log(N/ϵ)^2d) in time. To the best of our knowledge, this is the best asymptotic complexity for this class of problems. Furthermore, our method is embarrassingly parallel, automatically exploits low-dimensional structure in the data, and can perform Gaussian-process regression in linear (in N) space complexity. Motivated by the optimality properties of our method, we propose to apply it to the joint covariance of training and prediction points in Gaussian-process regression, greatly improving stability and computational cost. Finally, we show how to apply our method to the important setting of Gaussian processes with additive noise, sacrificing neither accuracy nor computational complexity.
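The closed-form solution mentioned in the abstract can be sketched column by column: for each column i, restrict Θ to that column's sparsity set, solve a small linear system against the coordinate vector of the diagonal entry, and normalize. The sketch below illustrates this under assumed conventions (lower-triangular L, `sparsity[i]` a sorted index set containing i itself); the function name and input format are illustrative, not the paper's API.

```python
import numpy as np

def kl_optimal_cholesky(Theta, sparsity):
    """Sketch of the closed-form KL-minimizing sparse inverse Cholesky factor.

    Theta    : (N, N) dense SPD covariance matrix
    sparsity : list where sparsity[i] is the sorted list of row indices
               (all >= i, including i) allowed to be nonzero in column i
    Returns lower-triangular L with L @ L.T approximating inv(Theta).
    """
    N = Theta.shape[0]
    L = np.zeros((N, N))
    for i in range(N):
        s = sparsity[i]                      # allowed nonzero rows of column i
        k = s.index(i)                       # position of the diagonal entry in s
        e = np.zeros(len(s))
        e[k] = 1.0
        # Solve the small restricted system Theta[s, s] v = e_k ...
        v = np.linalg.solve(Theta[np.ix_(s, s)], e)
        # ... and normalize so the diagonal entry equals sqrt(v[k]).
        L[s, i] = v / np.sqrt(v[k])
    return L
```

With the full lower-triangular pattern this recovers the exact Cholesky factor of Θ^-1; with a restricted pattern each column costs only the cube of its sparsity-set size, which is what makes the overall method fast.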


Related research

- Sparse Cholesky factorization by greedy conditional selection (07/21/2023)
- Fast Approximate Multi-output Gaussian Processes (08/22/2020)
- Variational sparse inverse Cholesky approximation for latent Gaussian processes via double Kullback-Leibler minimization (01/30/2023)
- Low-Cost Bayesian Inference for Additive Approximate Gaussian Process (12/31/2017)
- Convergence of Sparse Variational Inference in Gaussian Processes Regression (08/01/2020)
- Parallelizable sparse inverse formulation Gaussian processes (SpInGP) (10/25/2016)
- On the precision matrix of an irregularly sampled AR(1) process (01/11/2018)
