Sparse Cholesky factorization by greedy conditional selection

07/21/2023
by Stephen Huan, et al.

Dense kernel matrices resulting from pairwise evaluations of a kernel function arise naturally in machine learning and statistics. Previous work on constructing sparse approximate inverse Cholesky factors of such matrices by minimizing Kullback-Leibler divergence recovers the Vecchia approximation for Gaussian processes. These methods rely only on the geometry of the evaluation points to construct the sparsity pattern. In this work, we instead construct the sparsity pattern with a greedy selection algorithm that maximizes mutual information with the target points, conditional on all points previously selected. For selecting k points out of N, the naive time complexity is 𝒪(N k^4), but by maintaining a partial Cholesky factor we reduce this to 𝒪(N k^2). Furthermore, for multiple (m) targets we achieve a time complexity of 𝒪(N k^2 + N m^2 + m^3), which is maintained in the setting of aggregated Cholesky factorization, where a selected point need not condition every target. We apply the selection algorithm to image classification and to the recovery of sparse Cholesky factors. By minimizing Kullback-Leibler divergence, we apply the algorithm to Cholesky factorization, Gaussian process regression, and preconditioning for the conjugate gradient method, improving over k-nearest-neighbors selection.
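The greedy criterion described above can be illustrated with a minimal sketch (not the authors' optimized implementation): for a single target, maximizing mutual information conditional on the already selected points is equivalent to greedily picking the candidate that most reduces the target's posterior variance. The sketch below recomputes the conditional variance from scratch at every step, which corresponds to the naive 𝒪(N k^4)-style cost mentioned in the abstract; the paper's contribution is avoiding this recomputation by maintaining a partial Cholesky factor. The RBF kernel and all function names here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, length_scale=1.0):
    """Illustrative RBF kernel; the method applies to any positive-definite kernel."""
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * length_scale**2))

def greedy_select(X, x_target, k):
    """Naively select k rows of X most informative about x_target.

    Each round picks the candidate minimizing the target's posterior
    variance given the selected set (equivalently, maximizing mutual
    information for a single target). Recomputing the solve each round
    is deliberately naive, O(N k^4) overall.
    """
    N = len(X)
    points = np.vstack([X, x_target])
    K = rbf_kernel(points, points)
    t = N  # index of the target point in the stacked array
    selected, candidates = [], list(range(N))
    for _ in range(k):
        best, best_var = None, np.inf
        for i in candidates:
            S = selected + [i]
            K_SS = K[np.ix_(S, S)]
            K_St = K[S, t]
            # posterior variance of the target conditional on S
            var = K[t, t] - K_St @ np.linalg.solve(K_SS, K_St)
            if var < best_var:
                best, best_var = i, var
        selected.append(best)
        candidates.remove(best)
    return selected
```

With an RBF kernel the first pick is simply the candidate nearest the target; later picks trade off closeness to the target against redundancy with points already selected, which is what distinguishes conditional selection from k-nearest neighbors.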


