Quantile Propagation for Wasserstein-Approximate Gaussian Processes

12/21/2019
by Rui Zhang, et al.

In this work, we develop a new approximation method to solve the analytically intractable Bayesian inference for Gaussian process models with factorizable Gaussian likelihoods and single-output latent functions. Our method, dubbed QP, is similar to expectation propagation (EP); however, it minimizes the L^2 Wasserstein distance instead of the Kullback-Leibler (KL) divergence. We consider the specific case in which the non-Gaussian likelihood is approximated by a Gaussian likelihood. We show that QP has the following properties: (1) QP matches quantile functions rather than moments as in EP; (2) QP and EP have the same local update for the mean of the approximate Gaussian likelihood; (3) the local variance estimate for the approximate likelihood is smaller for QP than for EP, addressing EP's over-estimation of the variance; (4) the optimal approximate Gaussian likelihood enjoys a univariate parameterization, reducing memory consumption and computation time. Furthermore, we provide a unified interpretation of EP and QP: under the same assumptions, both are coordinate descent algorithms of global KL and L^2 Wasserstein objective functions, respectively. In the performed experiments, we employ eight real-world datasets and show that QP outperforms EP for the task of Gaussian process binary classification.
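To make the quantile-matching idea concrete: for univariate distributions, the L^2 Wasserstein distance equals the L^2 distance between the two quantile functions, and for Gaussians it has the closed form W_2(N(m_1, s_1^2), N(m_2, s_2^2)) = sqrt((m_1 - m_2)^2 + (s_1 - s_2)^2). The sketch below (not the paper's QP algorithm, just an illustration of the underlying distance) checks the quantile-function definition against the closed form, assuming only the Python standard library.

```python
import math
from statistics import NormalDist

def w2_gaussian_closed_form(m1, s1, m2, s2):
    """Closed-form L2 Wasserstein distance between N(m1, s1^2) and N(m2, s2^2)."""
    return math.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

def w2_quantile_numeric(m1, s1, m2, s2, n=50_000):
    """Approximate W2 as the L2 distance between the two quantile functions,
    integrating |Q1(u) - Q2(u)|^2 over u in (0, 1) on a midpoint grid."""
    d1, d2 = NormalDist(m1, s1), NormalDist(m2, s2)
    total = 0.0
    for i in range(n):
        u = (i + 0.5) / n  # midpoint of the i-th subinterval of (0, 1)
        total += (d1.inv_cdf(u) - d2.inv_cdf(u)) ** 2
    return math.sqrt(total / n)
```

The two quantities agree up to the discretization error of the midpoint grid, e.g. `w2_quantile_numeric(0.0, 1.0, 1.0, 2.0)` is close to `w2_gaussian_closed_form(0.0, 1.0, 1.0, 2.0) == sqrt(2)`.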


