Improving upon the effective sample size based on Godambe information for block likelihood inference

by Rahul Mukerjee et al.

We consider the effective sample size, based on Godambe information, for block likelihood inference, which is an attractive and computationally feasible alternative to full likelihood inference for large correlated datasets. With reference to a Gaussian random field having a constant mean, we explore how the choice of blocks impacts this effective sample size. It is seen that spreading out the spatial points within each block, instead of keeping them close together, can lead to considerable gains while retaining computational simplicity. Analytical results in this direction are obtained under the AR(1) model. The insights so found facilitate the study of other models, including correlation models on a plane, where closed-form expressions are intractable.
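The quantity studied in the abstract can be sketched numerically. For estimating a constant mean of a unit-variance AR(1) field with known correlation, each block contributes the score 1'Σ_b⁻¹(y_b − μ); the Godambe information is G = H²/J, where H is the sensitivity (sum of 1'Σ_b⁻¹1 over blocks) and J the variability of the combined score, and the effective sample size equals σ²G. The sketch below (function names and the specific block layouts are illustrative, not from the paper) compares contiguous blocks with interleaved, spread-out blocks, in the spirit of the paper's claim:

```python
import numpy as np

def ar1_cov(idx, rho):
    """Covariance matrix of a unit-variance AR(1) field at integer sites idx."""
    idx = np.asarray(idx)
    return rho ** np.abs(idx[:, None] - idx[None, :])

def godambe_ess(blocks, rho, n):
    """Effective sample size (sigma^2 * Godambe information) for the mean
    of an AR(1) field under block (composite) likelihood.
    blocks: list of integer site-index arrays partitioning range(n)."""
    Sigma = ar1_cov(np.arange(n), rho)
    # Per-block weight vectors w_b = Sigma_b^{-1} 1_b
    ws = [(b, np.linalg.solve(Sigma[np.ix_(b, b)], np.ones(len(b))))
          for b in blocks]
    # Sensitivity H = sum_b 1' Sigma_b^{-1} 1
    H = sum(w.sum() for _, w in ws)
    # Variability J = sum_{b,c} w_b' Sigma_{bc} w_c (covers b = c too)
    J = sum(wb @ Sigma[np.ix_(b, c)] @ wc for b, wb in ws for c, wc in ws)
    return H * H / J  # equals ESS since sigma^2 = 1

n, rho = 12, 0.5
contiguous = [np.arange(0, 6), np.arange(6, 12)]       # blocks of adjacent sites
spread = [np.arange(0, 12, 2), np.arange(1, 12, 2)]    # interleaved sites
ess_c = godambe_ess(contiguous, rho, n)
ess_s = godambe_ess(spread, rho, n)
```

With a single block the formula reduces to the full-likelihood Fisher information, whose AR(1) closed form is (n − (n−2)ρ)/(1+ρ); in this small example the interleaved blocks yield a larger effective sample size than the contiguous ones, consistent with the abstract's message that spreading points within blocks can pay off.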



