Barely Biased Learning for Gaussian Process Regression

09/20/2021
by David R. Burt, et al.

Recent work in scalable approximate Gaussian process regression has discussed a bias-variance-computation trade-off when estimating the log marginal likelihood. We suggest a method that adaptively selects the amount of computation to use when estimating the log marginal likelihood so that the bias of the objective function is guaranteed to be small. While simple in principle, our current implementation of the method is not competitive computationally with existing approximations.
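The abstract describes the idea only in words. As a purely illustrative sketch (not the method of this paper, whose details are not given here), the snippet below shows one standard way an adaptive computation-versus-bias loop can look for sparse Gaussian process regression: compute Titsias's collapsed evidence lower bound, whose gap to log N(y; 0, Qnn + sigma^2 I) is the tractable trace term tr(Knn - Qnn) / (2 sigma^2), and double the number of inducing points until that slack, used here as a heuristic bias proxy, falls below a tolerance. The kernel, toy data, tolerance, and doubling schedule are all assumptions made for the example.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix."""
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def elbo_and_bias_proxy(X, y, Z, noise_var=0.01, jitter=1e-8):
    """Titsias's collapsed ELBO for sparse GP regression, plus the slack
    tr(Knn - Qnn) / (2 sigma^2), which is exactly the gap between the
    ELBO and log N(y; 0, Qnn + sigma^2 I)."""
    n, m = X.shape[0], Z.shape[0]
    Kmm = rbf(Z, Z) + jitter * np.eye(m)
    Kmn = rbf(Z, X)
    L = np.linalg.cholesky(Kmm)
    A = np.linalg.solve(L, Kmn)            # Nystrom factor: Qnn = A.T @ A
    trace_slack = n - np.sum(A**2)         # tr(Knn - Qnn); RBF diag is 1
    # log N(y; 0, Qnn + sigma^2 I) via the matrix inversion lemma
    B = np.eye(m) + A @ A.T / noise_var
    LB = np.linalg.cholesky(B)
    c = np.linalg.solve(LB, A @ y) / noise_var
    logdet = 2.0 * np.sum(np.log(np.diag(LB))) + n * np.log(noise_var)
    quad = y @ y / noise_var - c @ c
    log_gauss = -0.5 * (n * np.log(2.0 * np.pi) + logdet + quad)
    bias_proxy = trace_slack / (2.0 * noise_var)
    return log_gauss - bias_proxy, bias_proxy

# Toy data, then an adaptive loop: double the number of inducing points
# until the bias proxy drops below a tolerance (or every input is used).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

tol, m = 1e-3, 5
while True:
    Z = X[rng.choice(len(X), size=m, replace=False)]
    elbo, bias_proxy = elbo_and_bias_proxy(X, y, Z)
    if bias_proxy < tol or m == len(X):
        break
    m = min(2 * m, len(X))
```

Because the slack term is computable in O(nm^2), the stopping rule costs no more than the bound itself; the paper's point, by contrast, is that guaranteeing a small bias this way can still be uncompetitive with fixed-budget approximations.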

