Reduced-dimensional Monte Carlo Maximum Likelihood for Latent Gaussian Random Field Models

10/22/2019
by Jaewoo Park, et al.

Monte Carlo maximum likelihood (MCML) provides an elegant approach to finding maximum likelihood estimators (MLEs) for latent variable models. However, MCML algorithms are computationally expensive when the latent variables are high-dimensional and correlated, as is the case for latent Gaussian random field models. Latent Gaussian random field models are widely used, for example to build flexible regression models and to interpolate spatially dependent data, in applications ranging from count data in disease modeling to presence-absence satellite images of ice sheets. We propose a computationally efficient MCML algorithm that uses a projection-based approach to reduce the dimension of the random effects. We develop an iterative method for finding an effective importance function; this is generally a challenging problem and is crucial for the MCML algorithm to be computationally feasible. We find that our method is applicable to both continuous-domain (latent Gaussian process) and discrete-domain (latent Gaussian Markov random field) models. We illustrate our methods on simulated and real data examples for which maximum likelihood estimation would otherwise be very challenging. Furthermore, we study an often overlooked challenge in MCML approaches to latent variable models: the practical issues of calculating standard errors of the resulting estimates and of assessing whether the resulting confidence intervals attain nominal coverage. Our study therefore provides useful insights into the details of implementing MCML algorithms for high-dimensional latent variable models.
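To make the idea concrete, below is a minimal, self-contained sketch of projection-based Monte Carlo likelihood estimation; it is an illustration under stated assumptions, not the authors' implementation. It assumes a Poisson spatial GLMM with an exponential covariance, replaces the n-dimensional latent field W by W ≈ M·delta, where the columns of M are the r leading scaled eigenvectors of the covariance, and draws importance samples from a single Gaussian (Laplace-style) approximation built at a reference parameter psi; the paper instead develops an iterative scheme for constructing an effective importance function. All variable names and constants (n, r, m, the 0.2 range parameter) are illustrative assumptions.

```python
# Minimal sketch: Monte Carlo likelihood with projection-based dimension
# reduction for a latent Gaussian random field model (illustration only;
# the model, importance function, and constants are assumptions).
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)

# ---- synthetic data: Poisson counts driven by a latent Gaussian field ----
n, r, m = 100, 10, 2000                 # locations, projected dimension, MC samples
coords = rng.uniform(size=(n, 2))
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
Sigma = np.exp(-dist / 0.2)             # exponential covariance (assumed)
evals, evecs = np.linalg.eigh(Sigma)
M = evecs[:, -r:] * np.sqrt(evals[-r:]) # projection basis: W ~= M @ delta

X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true, sigma_true = np.array([0.5, 1.0]), 0.7
y = rng.poisson(np.exp(X @ beta_true + M @ (sigma_true * rng.normal(size=r))))

def log_joint(theta, delta):
    """log f(y, delta | theta), dropping the theta-free log(y!) term.
    delta may be a single vector (r,) or a batch (k, r)."""
    beta, log_sigma = theta[:2], theta[2]
    delta = np.atleast_2d(delta)
    eta = X @ beta + delta @ M.T                        # (k, n) linear predictors
    loglik = eta @ y - np.exp(eta).sum(axis=1)          # Poisson terms
    logprior = (-0.5 * (delta ** 2).sum(axis=1) / np.exp(2 * log_sigma)
                - r * log_sigma - 0.5 * r * np.log(2 * np.pi))
    return np.squeeze(loglik + logprior)

# ---- Gaussian importance function at a reference parameter psi ----
psi = np.array([np.log(y.mean() + 0.5), 0.0, 0.0])      # crude starting value

def neg_grad_psi(delta):                                # gradient of -log_joint in delta
    eta = X @ psi[:2] + M @ delta
    return -(M.T @ (y - np.exp(eta)) - delta / np.exp(2 * psi[2]))

mode = minimize(lambda d: -log_joint(psi, d), np.zeros(r),
                jac=neg_grad_psi, method="BFGS").x
eta_hat = X @ psi[:2] + M @ mode
H = M.T @ (np.exp(eta_hat)[:, None] * M) + np.eye(r) / np.exp(2 * psi[2])
cov = np.linalg.inv(H)                                  # curvature at the mode

delta_samples = rng.multivariate_normal(mode, cov, size=m)
log_h = multivariate_normal.logpdf(delta_samples, mean=mode, cov=cov)

# ---- Monte Carlo log-likelihood: log (1/m) sum_k f(y, delta_k | theta) / h(delta_k)
def neg_mc_loglik(theta):
    logw = log_joint(theta, delta_samples) - log_h
    return -(logsumexp(logw) - np.log(m))

theta_hat = minimize(neg_mc_loglik, psi, method="Nelder-Mead").x
print("MC likelihood estimates (beta0, beta1, log sigma):", theta_hat)
print("truth:", beta_true, np.log(sigma_true))
```

Because the importance samples are fixed, the Monte Carlo log-likelihood surface is a deterministic function of theta and can be handed to a standard optimizer; the quality of the estimate hinges on how well the importance function matches the conditional distribution of the projected random effects, which is the problem the paper's iterative construction addresses.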

Related research

Fast expectation-maximization algorithms for spatial generalized linear mixed models (09/12/2019)
Spatial generalized linear mixed models (SGLMMs) are popular and flexibl...

Graphical Laplace-approximated maximum likelihood estimation: approximated likelihood inference for network data analysis (07/13/2021)
We derive Laplace-approximated maximum likelihood estimators (GLAMLEs) o...

A Latent-Variable Lattice Model (12/23/2015)
Markov random field (MRF) learning is intractable, and its approximation...

Recursive nonlinear-system identification using latent variables (06/14/2016)
In this paper we develop a method for learning nonlinear systems with mu...

Modeling High-Dimensional Data with Case-Control Sampling and Dependency Structures (01/11/2018)
Modern data sets in various domains often include units that were sample...

Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood Estimation for Latent Gaussian Models (06/05/2023)
Latent Gaussian models have a rich history in statistics and machine lea...

A Class of Models for Large Zero-inflated Spatial Data (04/05/2023)
Spatially correlated data with an excess of zeros, usually referred to a...
