VBALD - Variational Bayesian Approximation of Log Determinants

02/21/2018
by Diego Granziol et al.

Evaluating the log determinant of a positive definite matrix is ubiquitous in machine learning. Applications range from Gaussian processes, minimum-volume ellipsoids, metric learning, kernel learning, Bayesian neural networks, determinantal point processes and Markov random fields to partition functions of discrete graphical models. To avoid the canonical, yet prohibitive, O(n^3) cost of a Cholesky factorisation, we propose a novel approach with O(n^2) complexity based on a constrained variational Bayes algorithm. We compare our method to Taylor, Chebyshev and Lanczos approaches and show state-of-the-art performance on both synthetic and real-world datasets.
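The abstract contrasts the proposed variational method with Chebyshev and Lanczos baselines, all of which reduce log-determinant estimation to matrix-vector products at O(n^2) cost per product. As a hedged illustration of this family of estimators (not the paper's variational algorithm), the sketch below combines Hutchinson trace estimation with a Chebyshev expansion of the logarithm; the function name and the assumption that spectral bounds `lmin`/`lmax` are available are illustrative choices, not details from the paper.

```python
import numpy as np

def logdet_chebyshev(A, lmin, lmax, degree=40, num_probes=30, seed=0):
    """Estimate log det(A) for symmetric positive definite A.

    Uses Hutchinson's stochastic trace estimator,
        log det(A) = tr(log A) ~ mean_i z_i^T log(A) z_i,
    with log approximated by a degree-`degree` Chebyshev interpolant
    on the spectral interval [lmin, lmax]. Only matrix-vector products
    with A are needed, so dense cost is O(n^2) per product, avoiding
    the O(n^3) Cholesky factorisation.
    """
    n = A.shape[0]
    # Affine map of [lmin, lmax] onto [-1, 1]: lambda = a*t + b.
    a = (lmax - lmin) / 2.0
    b = (lmax + lmin) / 2.0
    # Chebyshev interpolation coefficients of f(t) = log(a*t + b)
    # at the Chebyshev nodes t_j = cos(theta_j).
    k = np.arange(degree + 1)
    theta = np.pi * (np.arange(degree + 1) + 0.5) / (degree + 1)
    f = np.log(a * np.cos(theta) + b)
    c = (2.0 / (degree + 1)) * np.cos(np.outer(k, theta)) @ f
    c[0] /= 2.0

    rng = np.random.default_rng(seed)
    est = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        # Three-term Chebyshev recurrence applied to the scaled matrix
        # B = (A - b I)/a, accumulating sum_k c_k T_k(B) z.
        w0 = z
        w1 = (A @ z - b * z) / a
        acc = c[0] * w0 + c[1] * w1
        for _j in range(2, degree + 1):
            w2 = 2.0 * (A @ w1 - b * w1) / a - w0
            acc += c[_j] * w2
            w0, w1 = w1, w2
        est += z @ acc
    return est / num_probes
```

In practice the bounds can be obtained cheaply, e.g. via a few power iterations for `lmax` and a Gershgorin-style lower bound for `lmin`; accuracy degrades as the condition number lmax/lmin grows, which is one motivation the abstract gives for seeking alternative estimators.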


Related research:

11/09/2017 - Scalable Log Determinants for Gaussian Process Kernel Learning
For applications as varied as Bayesian neural networks, determinantal po...

04/05/2017 - Bayesian Inference of Log Determinants
The log-determinant of a kernel matrix appears in a variety of machine l...

02/08/2022 - Improved Convergence Rates for Sparse Approximation Methods in Kernel-Based Learning
Kernel-based models such as kernel ridge regression and Gaussian process...

04/24/2017 - Entropic Trace Estimates for Log Determinants
The scalable calculation of matrix determinants has been a bottleneck to...

03/14/2018 - Bucket Renormalization for Approximate Inference
Probabilistic graphical models are a key tool in machine learning applic...

01/24/2019 - Adversarial Variational Inference and Learning in Markov Random Fields
Markov random fields (MRFs) find applications in a variety of machine le...

10/04/2018 - Markov Properties of Discrete Determinantal Point Processes
Determinantal point processes (DPPs) are probabilistic models for repuls...
