A piece-wise constant approximation for non-conjugate Gaussian Process models

04/22/2022
by Sarem Seitz, et al.

Gaussian Processes (GPs) are a versatile and popular method in Bayesian Machine Learning. A common modification is the Sparse Variational Gaussian Process (SVGP), which is well suited to large datasets. While GPs handle Gaussian-distributed target variables elegantly in closed form, their applicability can be extended to non-Gaussian data as well. These extensions are usually impossible to treat in closed form and hence require approximate solutions. This paper proposes approximating the inverse-link function, which is necessary when working with non-Gaussian likelihoods, with a piece-wise constant function. It is shown that this yields a closed-form solution for the corresponding SVGP lower bound. In addition, it is demonstrated how the piece-wise constant function itself can be optimized, resulting in an inverse-link function that can be learnt from the data at hand.
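The central computational point can be illustrated with a minimal sketch (not the paper's code; the bin edges, bin values, and the exponential target link below are illustrative assumptions): when the inverse-link g(f) is piece-wise constant, its expectation under a Gaussian variational marginal q(f) = N(mu, sigma^2) reduces to a weighted sum of Gaussian CDF differences over the bins, which is exactly the kind of closed-form quantity a non-conjugate SVGP lower bound needs.

```python
# Sketch: closed-form expectation of a piece-wise constant inverse-link
# under a Gaussian variational marginal, checked against Monte Carlo.
import numpy as np
from scipy.stats import norm

def piecewise_constant_link(edges, values):
    """Return g(f) that equals values[k] on the interval [edges[k], edges[k+1])."""
    def g(f):
        idx = np.clip(np.searchsorted(edges, f, side="right") - 1,
                      0, len(values) - 1)
        return values[idx]
    return g

def expected_link_closed_form(mu, sigma, edges, values):
    """E_{f ~ N(mu, sigma^2)}[g(f)] for a piece-wise constant g: a sum of
    bin values weighted by Gaussian CDF differences at the bin edges."""
    cdf = norm.cdf((edges - mu) / sigma)   # CDF evaluated at each bin edge
    probs = np.diff(cdf)                   # Gaussian mass falling in each bin
    return np.sum(values * probs)

# Illustrative setup: approximate exp(f) (a Poisson inverse-link) on [-3, 3]
edges = np.linspace(-3.0, 3.0, 41)         # 40 bins
midpoints = 0.5 * (edges[:-1] + edges[1:])
values = np.exp(midpoints)                 # one constant value per bin

mu, sigma = 0.3, 0.8                       # assumed variational moments of f(x)
closed_form = expected_link_closed_form(mu, sigma, edges, values)

# Monte Carlo estimate of the same expectation for comparison
samples = np.random.default_rng(0).normal(mu, sigma, 200_000)
mc = piecewise_constant_link(edges, values)(samples).mean()
print(closed_form, mc)                     # the two estimates should agree closely
```

In the paper's setting, expectations of this form appear inside the SVGP evidence lower bound; with a piece-wise constant inverse-link they can be evaluated analytically rather than by quadrature or sampling, and the bin values themselves can be treated as free parameters and optimized.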

Related research

- Chained Gaussian Processes (04/18/2016): Gaussian process models are flexible, Bayesian non-parametric approaches...
- Conditioning Sparse Variational Gaussian Processes for Online Decision-making (10/28/2021): With a principled representation of uncertainty and closed form posterio...
- A Tutorial on Sparse Gaussian Processes and Variational Inference (12/27/2020): Gaussian processes (GPs) provide a framework for Bayesian inference that...
- Connections and Equivalences between the Nyström Method and Sparse Variational Gaussian Processes (06/02/2021): We investigate the connections between sparse approximation methods for ...
- Gaussian Processes for Nonlinear Signal Processing (03/12/2013): Gaussian processes (GPs) are versatile tools that have been successfully...
- Anomaly Detection and Removal Using Non-Stationary Gaussian Processes (07/02/2015): This paper proposes a novel Gaussian process approach to fault removal i...
- Sequential Learning of Active Subspaces (07/26/2019): In recent years, active subspace methods (ASMs) have become a popular me...
