Improving performances of MCMC for Nearest Neighbor Gaussian Process models with full data augmentation
Even though Nearest Neighbor Gaussian Processes (NNGP) considerably alleviate the MCMC implementation of Bayesian space-time models, they do not solve the convergence problems caused by high model dimension. Frugal alternatives such as response or collapsed algorithms are one answer. Our approach is instead to keep full data augmentation and to make it more efficient. We present two strategies to do so. The first strategy is to pay particular attention to the seemingly trivial fixed effects of the model. We show empirically that re-centering the latent field on the intercept critically improves chain behavior, and we extend this approach to other fixed effects that may interfere with a coherent spatial field. We propose a simple method that requires no tuning while remaining affordable thanks to the sparsity of NNGP. The second strategy accelerates the sampling of the random field using chromatic samplers. This method reduces a long sequential simulation to group-parallelized or group-vectorized sampling; the attractive possibility of parallelizing the NNGP likelihood can therefore be carried over to field sampling. We provide an R implementation of our methods for Gaussian fields in the public repository https://github.com/SebastienCoube/Improving_NNGP_full_augmentation, along with an extensive vignette. We run our implementation on two synthetic toy examples alongside the state-of-the-art package spNNGP. Finally, we apply our method to a real data set of lead contamination in the mainland United States of America.
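To make the second strategy concrete, the sketch below shows what a chromatic Gibbs sweep for a Gaussian Markov random field could look like in R: nodes are greedily colored so that no two neighbors share a color, and each color class is then updated in a single vectorized draw. This is a minimal illustrative sketch under assumed inputs (a sparse precision matrix `Q` and canonical mean `b`), not the implementation from the repository; the function names and the greedy coloring are hypothetical, and how the NNGP graph is actually built and colored in the package may differ.

```r
# Minimal sketch of a chromatic Gibbs sweep for a Gaussian Markov random field.
# Assumptions (illustrative, not taken from the paper's code): `Q` is a sparse
# precision matrix (Matrix package), `b` the canonical mean, so the full
# conditional of node i is N((b_i - sum_{j != i} Q_ij w_j) / Q_ii, 1 / Q_ii).
library(Matrix)

greedy_coloring <- function(Q) {
  n <- nrow(Q)
  color <- integer(n)
  for (i in seq_len(n)) {
    nbrs <- setdiff(which(Q[i, ] != 0), i)               # neighbors in the graph
    color[i] <- min(setdiff(seq_len(n), color[nbrs]))     # smallest unused color
  }
  color
}

chromatic_gibbs_sweep <- function(w, Q, b, color) {
  for (k in sort(unique(color))) {
    idx  <- which(color == k)       # same-color nodes are conditionally independent
    prec <- diag(Q)[idx]            # their conditional precisions
    # conditional means: subtract neighbors' contributions, excluding each node itself
    m <- (b[idx] - as.numeric(Q[idx, , drop = FALSE] %*% w) + prec * w[idx]) / prec
    w[idx] <- rnorm(length(idx), mean = m, sd = 1 / sqrt(prec))  # one vectorized draw per color
  }
  w
}
```

Because nodes within one color class have no edges between them, updating them jointly from the current state is a valid Gibbs step, which is what lets the long sequential field update collapse into a few grouped, vectorizable draws.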