Gaussian process classification using posterior linearisation

This paper proposes a new algorithm for Gaussian process classification based on posterior linearisation (PL). In PL, a Gaussian approximation to the posterior density is obtained iteratively by selecting the best possible linearisation of the conditional mean of the labels and accounting for the resulting linearisation error. Across three widely used likelihood functions, PL generally achieves lower classification errors on real data sets than the expectation propagation and Laplace algorithms.
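The PL iteration summarised above can be sketched for a probit (Bernoulli) likelihood. This is a minimal illustration under stated assumptions, not the paper's implementation: the function names are ours, the statistical linear regression (SLR) moments are computed by Gauss-Hermite quadrature, and no damping or convergence check is included.

```python
import numpy as np
from scipy.stats import norm

def slr_moments(m, s, n_quad=32):
    """Gauss-Hermite moments of the probit observation model under N(m_i, s_i).

    Returns E[h(f)], Var[h(f)], Cov(f, h(f)) and E[Var(y|f)] per data point,
    where h(f) = Phi(f) is the conditional mean of a Bernoulli label.
    """
    x, w = np.polynomial.hermite_e.hermegauss(n_quad)  # nodes/weights for exp(-x^2/2)
    w = w / np.sqrt(2.0 * np.pi)                       # normalise against N(0, 1)
    f = m[:, None] + np.sqrt(s)[:, None] * x[None, :]  # quadrature points, shape (n, n_quad)
    h = norm.cdf(f)                                    # E[y | f] for the probit likelihood
    Eh = h @ w
    Vh = (h - Eh[:, None]) ** 2 @ w
    Cfh = ((f - m[:, None]) * (h - Eh[:, None])) @ w
    ER = (h * (1.0 - h)) @ w                           # expected Bernoulli variance
    return Eh, Vh, Cfh, ER

def pl_gp_classify(K, y, n_iter=30, jitter=1e-9):
    """Iterated posterior linearisation for binary GP classification (sketch).

    Each sweep linearises y ~ A f + b + e around the *current* posterior via
    SLR, then recomputes the Gaussian posterior of f ~ N(0, K) under that
    linear-Gaussian model, with the linearisation error folded into Om.
    """
    n = len(y)
    m, S = np.zeros(n), K.copy()
    for _ in range(n_iter):
        s = np.clip(np.diag(S), jitter, None)
        Eh, Vh, Cfh, ER = slr_moments(m, s)
        A = Cfh / s                    # SLR gain (diagonal: labels are cond. independent)
        b = Eh - A * m                 # SLR offset
        Om = Vh - A ** 2 * s + ER      # linearisation error + expected label noise
        # Gaussian posterior for y = A f + b + e, f ~ N(0, K), e ~ N(0, diag(Om))
        Sm = A[:, None] * K * A[None, :] + np.diag(Om)
        G = np.linalg.solve(Sm, (K * A[None, :]).T).T  # gain K A^T Sm^{-1}
        m = G @ (y - b)
        S = K - G @ (A[:, None] * K)
    return m, S
```

On a well-separated 1-D toy problem with an RBF kernel, the returned posterior mean follows the sign pattern of the labels; the expectation propagation and Laplace baselines differ only in how this Gaussian posterior approximation is constructed.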


Related research

- 04/21/2019: Gaussian Process Regression and Classification under Mathematical Constraints with Learning Guarantees. "We introduce constrained Gaussian process (CGP), a Gaussian process mode..."
- 07/16/2012: Nested Expectation Propagation for Gaussian Process Classification with a Multinomial Probit Likelihood. "We consider probabilistic multinomial probit classification using Gaussi..."
- 05/16/2022: On the inability of Gaussian process regression to optimally learn compositional functions. "We rigorously prove that deep Gaussian process priors can outperform Gau..."
- 05/27/2016: Merging MCMC Subposteriors through Gaussian-Process Approximations. "Markov chain Monte Carlo (MCMC) algorithms have become powerful tools fo..."
- 11/11/2022: Towards Improved Learning in Gaussian Processes: The Best of Two Worlds. "Gaussian process training decomposes into inference of the (approximate)..."
- 01/21/2014: Hilbert Space Methods for Reduced-Rank Gaussian Process Regression. "This paper proposes a novel scheme for reduced-rank Gaussian process reg..."
- 12/21/2019: Quantile Propagation for Wasserstein-Approximate Gaussian Processes. "In this work, we develop a new approximation method to solve the analyti..."
