A piece-wise constant approximation for non-conjugate Gaussian Process models

by Sarem Seitz, et al.

Gaussian Processes (GPs) are a versatile and popular method in Bayesian Machine Learning. A common modification is the Sparse Variational Gaussian Process (SVGP), which is well suited to large datasets. While GPs handle Gaussian-distributed target variables elegantly in closed form, they can be extended to non-Gaussian data as well. These extensions are usually intractable in closed form and hence require approximate solutions. This paper proposes to approximate the inverse-link function, which is necessary when working with non-Gaussian likelihoods, by a piece-wise constant function. It is shown that this yields a closed-form solution for the corresponding SVGP lower bound. In addition, it is demonstrated how the piece-wise constant function itself can be optimized, resulting in an inverse-link function that can be learnt from the data at hand.
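The key observation behind the closed-form claim can be illustrated with a minimal sketch. If the inverse-link function is replaced by a constant on each of a set of intervals, then its expectation under a Gaussian variational distribution reduces to a weighted sum of Gaussian CDF differences, which is exactly computable. The sketch below is illustrative and not the paper's implementation; the choice of knots, the midpoint evaluation of the inverse link, and the example link `exp` are assumptions for demonstration.

```python
import numpy as np
from scipy.stats import norm

def piecewise_constant_values(g, knots):
    """Approximate an inverse-link g by one constant per interval
    [knots[i], knots[i+1]), evaluated at the interval midpoint
    (an illustrative choice; the paper optimizes these values)."""
    mids = 0.5 * (knots[:-1] + knots[1:])
    return g(mids)

def gaussian_expectation(values, knots, mu, sigma):
    """E[g_pc(f)] for f ~ N(mu, sigma^2) in closed form:
    each interval contributes its constant times the Gaussian
    probability mass on that interval."""
    cdf = norm.cdf((knots - mu) / sigma)
    return np.sum(values * np.diff(cdf))

# Example: exp as inverse link, approximated on [-5, 5] with 200 bins.
knots = np.linspace(-5.0, 5.0, 201)
vals = piecewise_constant_values(np.exp, knots)
approx = gaussian_expectation(vals, knots, mu=0.0, sigma=1.0)
# For comparison: E[exp(f)] = exp(sigma^2 / 2) exactly when f ~ N(0, 1).
exact = np.exp(0.5)
```

Because the expectation involves only Gaussian CDF evaluations and the per-interval constants, no quadrature or Monte Carlo sampling is needed, which is what makes the corresponding SVGP lower bound tractable.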



