Laplace approximation and the natural gradient for Gaussian process regression with the heteroscedastic Student-t model

12/20/2017
by   Marcelo Hartmann, et al.

This paper considers the Laplace method to derive approximate inference for Gaussian process (GP) regression in the location and scale parameters of the Student-t probabilistic model. This allows both the mean and the variance of the data to vary as functions of the covariates, with the attractive feature that the Student-t model is widely used as a tool for robustifying data analysis. The challenge in approximate inference for GP regression with the Student-t probabilistic model lies in the analytical intractability of the posterior distribution and the lack of concavity of the log-likelihood function. We present the natural gradient adaptation for the estimation process, which relies primarily on the property that the Student-t model naturally has an orthogonal parametrization with respect to the location and scale parameters. Due to this particular property of the model, we also introduce an alternative Laplace approximation that uses the Fisher information matrix in place of the Hessian matrix of the negative log-likelihood function. According to our experiments, this alternative approximation provides very similar posterior approximations and predictive performance to the traditional Laplace approximation. We also compare both of these Laplace approximations with the Markov chain Monte Carlo (MCMC) method. Moreover, we compare our heteroscedastic Student-t model with GP regression under the heteroscedastic Gaussian model. Finally, we discuss how our approach can improve the inference algorithm in cases where the probabilistic model assumed for the data is not log-concave.
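The orthogonality the abstract refers to means the Fisher information of the Student-t model is block-diagonal in the location and (log-)scale parameters, so a natural gradient step decouples the two updates. The following sketch (not the paper's implementation; the GP prior is omitted and only the likelihood part is shown, with assumed synthetic data) illustrates a Fisher-scoring / natural gradient update for the location `f` and log-scale `g` of i.i.d. Student-t observations:

```python
import numpy as np

def nll_grad(y, f, g, nu):
    """Gradient of the summed negative log-likelihood of Student-t
    observations y with location f and log-scale g = log(sigma)."""
    sigma = np.exp(g)
    r = (y - f) / sigma
    d_f = -(nu + 1) * r / (sigma * (nu + r**2))
    d_g = 1.0 - (nu + 1) * r**2 / (nu + r**2)
    return np.array([d_f.sum(), d_g.sum()])

def fisher(n, g, nu):
    """Expected Fisher information of n observations. It is diagonal
    because location and scale are orthogonal in the Student-t model."""
    sigma2 = np.exp(2.0 * g)
    return np.diag([n * (nu + 1) / ((nu + 3) * sigma2),
                    n * 2.0 * nu / (nu + 3)])

# Synthetic heavy-tailed data (assumed values, for illustration only).
rng = np.random.default_rng(0)
nu, loc, scale = 4.0, 2.0, 1.5
y = loc + scale * rng.standard_t(nu, size=2000)

f, g = 0.0, 0.0  # initial location and log-scale
for _ in range(50):
    grad = nll_grad(y, f, g, nu)
    # Natural gradient step: precondition by the inverse Fisher matrix.
    step = np.linalg.solve(fisher(len(y), g, nu), grad)
    f, g = f - step[0], g - step[1]

print(f, np.exp(g))  # should be close to loc = 2.0, scale = 1.5
```

Because the Fisher matrix is diagonal here, the step for `f` never interferes with the step for `g`; in the paper's setting the same structure is what lets the Fisher information replace the (possibly indefinite) Hessian in the Laplace approximation.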


