The Rényi Gaussian Process

10/15/2019
by Raed Kontar, et al.

In this article we introduce an alternative closed-form lower bound on the Gaussian process (GP) likelihood based on the Rényi α-divergence. This new lower bound can be viewed as a convex combination of the Nyström approximation and the exact GP. The key advantage of this bound is its capability to control and tune the regularization enforced on the model; it thus generalizes traditional sparse variational GP regression. From the theoretical perspective, we show that, with probability at least 1-δ, the Rényi α-divergence between the variational distribution and the true posterior becomes arbitrarily small as the number of data points increases.
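To make the "convex combination" idea concrete, here is a minimal NumPy sketch, not the paper's actual bound: it blends the exact GP log marginal likelihood with a Nyström-based one (the low-rank surrogate kernel K_xz K_zz^{-1} K_zx built from hypothetical inducing inputs `Z`) via a mixing weight `a` in [0, 1]. All function names, the RBF kernel choice, and the noise level are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel on row vectors of X1, X2.
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_log_marginal(y, K, noise=0.1):
    # Exact GP log marginal likelihood log N(y | 0, K + noise*I),
    # computed stably via a Cholesky factorization.
    n = len(y)
    L = np.linalg.cholesky(K + noise * np.eye(n))
    v = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ v
            - np.log(np.diag(L)).sum()
            - 0.5 * n * np.log(2.0 * np.pi))

def blended_objective(y, X, Z, a, noise=0.1):
    # Illustrative convex combination of the exact log marginal and a
    # Nystrom (low-rank) log marginal; a=1 recovers the exact GP,
    # a=0 the pure Nystrom surrogate.
    K = rbf_kernel(X, X)
    Kzz = rbf_kernel(Z, Z) + 1e-8 * np.eye(len(Z))  # jitter for stability
    Kxz = rbf_kernel(X, Z)
    K_nys = Kxz @ np.linalg.solve(Kzz, Kxz.T)
    return (a * gp_log_marginal(y, K, noise)
            + (1.0 - a) * gp_log_marginal(y, K_nys, noise))
```

Varying `a` plays the role the abstract ascribes to α: it tunes how strongly the low-rank (regularized) approximation dominates the objective relative to the exact GP.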


