A Note on the comparison of Nearest Neighbor Gaussian Process (NNGP) based models

11/09/2018
by Lu Zhang, et al.

This note compares two Nearest-Neighbor Gaussian Process (NNGP) based models: the response NNGP model and the latent NNGP model. We show that comparisons based on the Kullback-Leibler divergence (KL-D) from the NNGP-based models to their parent GP-based model can lead to opposite conclusions in different regions of the parameter space. We also offer a heuristic explanation for the observation that the latent NNGP model tends to outperform the response NNGP model in approximating their parent GP-based model.
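To make the comparison concrete, below is a minimal, illustrative sketch (not the authors' code) of the kind of KL-D computation the note discusses. It builds a parent GP covariance on simulated locations, forms the Gaussian implied by a nearest-neighbor (Vecchia-type) approximation, and evaluates the KL divergence between the two zero-mean Gaussians. The exponential covariance, the coordinate-based ordering, the neighbor count m, and the function names (exp_cov, nngp_factors, nngp_cov, kl_gaussians) are all our assumptions for illustration.

```python
# A minimal sketch (not the authors' code) of comparing an NNGP-implied
# covariance to its parent GP covariance via Kullback-Leibler divergence.
# Assumptions: zero-mean GP, exponential covariance, m nearest "past"
# neighbors under a fixed coordinate ordering.
import numpy as np
from scipy.spatial.distance import cdist

def exp_cov(coords, sigma2=1.0, phi=3.0):
    """Exponential covariance: sigma2 * exp(-phi * distance)."""
    return sigma2 * np.exp(-phi * cdist(coords, coords))

def nngp_factors(C, coords, m=5):
    """Vecchia/NNGP factors: strictly lower-triangular A and diagonal D
    such that the NNGP covariance is (I-A)^{-1} D (I-A)^{-T}."""
    n = C.shape[0]
    A = np.zeros((n, n))
    D = np.zeros(n)
    D[0] = C[0, 0]
    dist = cdist(coords, coords)
    for i in range(1, n):
        # m nearest neighbors among previously ordered points
        nn = np.argsort(dist[i, :i])[:min(m, i)]
        Cnn = C[np.ix_(nn, nn)]
        cin = C[i, nn]
        a = np.linalg.solve(Cnn, cin)      # conditional regression weights
        A[i, nn] = a
        D[i] = C[i, i] - cin @ a           # conditional variance
    return A, D

def nngp_cov(C, coords, m=5):
    """Dense covariance implied by the NNGP approximation (toy scale only)."""
    A, D = nngp_factors(C, coords, m)
    n = C.shape[0]
    L = np.linalg.inv(np.eye(n) - A)       # dense inverse is fine for small n
    return L @ np.diag(D) @ L.T

def kl_gaussians(S_q, S_p):
    """KL( N(0, S_q) || N(0, S_p) ) between zero-mean Gaussians:
    0.5 * ( tr(S_p^{-1} S_q) - n + log det S_p - log det S_q )."""
    n = S_p.shape[0]
    Sp_inv_Sq = np.linalg.solve(S_p, S_q)
    _, ld_q = np.linalg.slogdet(S_q)
    _, ld_p = np.linalg.slogdet(S_p)
    return 0.5 * (np.trace(Sp_inv_Sq) - n + ld_p - ld_q)

rng = np.random.default_rng(0)
coords = rng.uniform(size=(200, 2))
coords = coords[np.argsort(coords[:, 0])]  # simple coordinate ordering
C = exp_cov(coords)
C_tilde = nngp_cov(C, coords, m=5)
print("KL(NNGP || parent GP):", kl_gaussians(C_tilde, C))
```

In these terms, the response NNGP roughly corresponds to applying the approximation to the outcome covariance C + tau^2 I directly, while the latent NNGP approximates the latent covariance C and adds the nugget tau^2 I afterward; rerunning the comparison across different (phi, tau^2) values illustrates how the KL-D ranking of the two models can flip across the parameter space, which is the phenomenon the note examines.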


Related research:

- Variational Nearest Neighbor Gaussian Processes (02/03/2022)
- Sparse Cholesky matrices in spatial statistics (02/26/2021)
- Scalable Cross Validation Losses for Gaussian Process Models (05/24/2021)
- Bayesian Kernel and Mutual k-Nearest Neighbor Regression (08/04/2016)
- Variable Selection Using Nearest Neighbor Gaussian Processes (03/26/2021)
- Improving performances of MCMC for Nearest Neighbor Gaussian Process models with full data augmentation (10/02/2020)
- Revisiting Rashomon: A Comment on "The Two Cultures" (04/05/2021)
