Barely Biased Learning for Gaussian Process Regression

09/20/2021
by David R. Burt, et al.

Recent work in scalable approximate Gaussian process regression has discussed a bias-variance-computation trade-off when estimating the log marginal likelihood. We suggest a method that adaptively selects the amount of computation to use when estimating the log marginal likelihood so that the bias of the objective function is guaranteed to be small. While simple in principle, our current implementation of the method is not competitive computationally with existing approximations.
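The abstract does not spell out the implementation, but the principle of adaptively choosing the amount of computation until the bias of the objective is provably small can be illustrated with standard sparse-GP machinery: the Titsias lower bound (ELBO) and a matching trace-inflated upper bound sandwich the exact log marginal likelihood, so growing the approximation until the gap between the two bounds falls below a tolerance certifies the bias. The NumPy sketch below shows that adaptive loop only as an illustration, not the authors' actual method; the RBF kernel, the inducing-point selection (the first `m` inputs), and all hyperparameter values are assumptions.

```python
import numpy as np


def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between two input sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)


def log_gauss(y, S):
    # log N(y | 0, S) via a Cholesky factorisation of S.
    n = len(y)
    L = np.linalg.cholesky(S)
    a = np.linalg.solve(L, y)
    return -0.5 * (n * np.log(2 * np.pi)
                   + 2 * np.sum(np.log(np.diag(L)))
                   + a @ a)


def lml_bounds(X, y, Z, noise):
    # Sandwich the exact log marginal likelihood between the Titsias
    # lower bound (ELBO) and a trace-inflated upper bound.
    n, m = len(X), len(Z)
    Knn = rbf(X, X)
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(m)        # jitter for stability
    Knm = rbf(X, Z)
    Q = Knm @ np.linalg.solve(Kmm, Knm.T)     # Nystrom approximation
    t = max(np.trace(Knn - Q), 0.0)           # trace of the PSD residual
    lower = log_gauss(y, Q + noise * np.eye(n)) - t / (2 * noise)
    # Since K - Q is PSD with trace t, K + noise*I <= Q + (noise + t)*I in
    # the Loewner order, so inflating the noise in the quadratic term (and
    # using the smaller determinant of Q + noise*I) gives an upper bound.
    Lq = np.linalg.cholesky(Q + noise * np.eye(n))
    a = np.linalg.solve(np.linalg.cholesky(Q + (noise + t) * np.eye(n)), y)
    upper = -0.5 * (n * np.log(2 * np.pi)
                    + 2 * np.sum(np.log(np.diag(Lq)))
                    + a @ a)
    return lower, upper


def adaptive_lml(X, y, noise=0.1, tol=1e-2):
    # Adaptively increase the amount of computation (number of inducing
    # points) until the upper-lower gap, which bounds the bias of the
    # surrogate objective, drops below tol.
    m = 2
    while True:
        lower, upper = lml_bounds(X, y, X[:m], noise)
        if upper - lower < tol or m >= len(X):
            return lower, upper, m
        m = min(2 * m, len(X))
```

Doubling `m` on each round keeps the number of bound evaluations logarithmic in the final amount of computation, while the returned gap is a computable certificate that the surrogate objective deviates from the exact log marginal likelihood by less than `tol`.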


Related research

- 02/21/2021: Inverse Gaussian Process regression for likelihood-free inference. In this work we consider Bayesian inference problems with intractable li...
- 02/12/2021: Bias-Free Scalable Gaussian Processes via Randomized Truncations. Scalable Gaussian Process methods are computationally attractive, yet in...
- 02/16/2021: Tighter Bounds on the Log Marginal Likelihood of Gaussian Process Regression Using Conjugate Gradients. We propose a lower bound on the log marginal likelihood of Gaussian proc...
- 11/02/2019: Sparse inversion for derivative of log determinant. Algorithms for Gaussian process, marginal likelihood methods or restrict...
- 10/29/2011: Efficient Marginal Likelihood Computation for Gaussian Process Regression. In a Bayesian learning setting, the posterior distribution of a predicti...
- 07/16/2022: Singular Woodbury and Pseudo-Determinant Matrix Identities and Application to Gaussian Process Regression. We study a matrix that arises in a singular formulation of the Woodbury ...
- 01/04/2023: A Scalable Gaussian Process for Large-Scale Periodic Data. The periodic Gaussian process (PGP) has been increasingly used to model ...
