Representing Additive Gaussian Processes by Sparse Matrices

04/29/2023
by Lu Zou, et al.

Among generalized additive models, additive Matérn Gaussian processes (GPs) are one of the most popular choices for scalable high-dimensional problems. Thanks to their additive structure and stochastic differential equation representation, back-fitting-based algorithms can reduce the time complexity of computing the posterior mean from O(n^3) to O(n log n), where n is the data size. However, generalizing these algorithms to efficiently compute the posterior variance and maximum log-likelihood remains an open problem. In this study, we demonstrate that for additive Matérn GPs, not only the posterior mean but also the posterior variance, the log-likelihood, and the gradients of all three can be represented by formulas involving only sparse matrices and sparse vectors. We show how to use these sparse formulas to generalize back-fitting-based algorithms so that the posterior mean, posterior variance, log-likelihood, and their gradients are all computed in O(n log n) time for additive GPs. We apply our algorithms to Bayesian optimization and propose efficient procedures for posterior updates, hyperparameter learning, and evaluation of the acquisition function and its gradient. Given the posterior, our algorithms significantly reduce the time complexity of computing the acquisition function and its gradient from O(n^2) to O(log n) for a general learning rate, and even to O(1) for a small learning rate.
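To give a flavor of why sparse formulas arise here, the sketch below uses a well-known special case (not code from the paper): a one-dimensional Matérn-1/2 (Ornstein–Uhlenbeck) GP is a Markov process, so on sorted inputs its precision matrix K^{-1} is tridiagonal, and the posterior mean m = K(K + σ²I)^{-1}y satisfies the sparse system (I + σ²K^{-1})m = y. All variable names and the toy data are illustrative.

```python
# Minimal sketch: sparse posterior-mean solve for a 1-D Matern-1/2 (OU) GP.
# The Markov property makes the precision Q = K^{-1} tridiagonal on sorted
# inputs, so the posterior mean solves a banded system instead of a dense one.
import numpy as np
from scipy.sparse import diags, identity
from scipy.sparse.linalg import spsolve

rng = np.random.default_rng(0)
n, ell, noise = 200, 0.5, 0.1            # data size, lengthscale, noise variance
x = np.sort(rng.uniform(0.0, 5.0, n))
y = np.sin(2 * x) + np.sqrt(noise) * rng.standard_normal(n)

# Tridiagonal precision Q = K^{-1} of the unit-variance OU kernel
# k(x, x') = exp(-|x - x'| / ell), from its AR(1) (Gauss-Markov) factorization.
r = np.exp(-np.diff(x) / ell)            # correlations between neighbours
d = np.ones(n)
d[:-1] += r**2 / (1 - r**2)              # contribution of right neighbour
d[1:] += r**2 / (1 - r**2)               # contribution of left neighbour
off = -r / (1 - r**2)
Q = diags([off, d, off], offsets=[-1, 0, 1], format="csc")

# Posterior mean m = K (K + noise*I)^{-1} y satisfies (I + noise*Q) m = y,
# a tridiagonal system solvable in O(n).
m_sparse = spsolve(identity(n, format="csc") + noise * Q, y)

# Dense O(n^3) reference for comparison.
K = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)
m_dense = K @ np.linalg.solve(K + noise * np.eye(n), y)
```

The two solves agree to numerical precision; the paper's contribution is extending this kind of sparse representation beyond the posterior mean to the posterior variance, the log-likelihood, and their gradients in the additive multivariate setting.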

