Estimating Fisher Information Matrix in Latent Variable Models based on the Score Function

09/13/2019
by   Maud Delattre, et al.

The Fisher information matrix (FIM) is a key quantity in statistics: it is required, for example, to evaluate the asymptotic precision of parameter estimates, to compute test statistics and asymptotic distributions in statistical testing, and to assess post-model-selection inference results or optimality criteria in experimental design. However, its exact computation is often not trivial. In particular, in many latent variable models it is intractable due to the presence of unobserved variables, so the observed FIM is usually used in this context to estimate the FIM. Several methods have been proposed to approximate the observed FIM when it cannot be evaluated analytically; among the most frequently used are Monte Carlo methods and iterative algorithms derived from the missing information principle. All of these methods require computing second derivatives of the complete-data log-likelihood, which is a computational disadvantage. In this paper, we present a new approach to estimating the FIM in latent variable models. The advantage of our method is that only the first derivatives of the log-likelihood are needed, in contrast to approaches based on the observed FIM. Indeed, we consider the empirical estimate of the covariance matrix of the score. We prove that this estimate of the Fisher information matrix is unbiased, consistent and asymptotically Gaussian. Moreover, we show that neither estimate dominates the other in terms of asymptotic covariance matrix. When the proposed estimate cannot be evaluated analytically, we present a stochastic approximation estimation algorithm that provides it as a by-product of the parameter estimates. We emphasize that the proposed algorithm only requires computing the first derivatives of the complete-data log-likelihood with respect to the parameters.
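To make the core idea concrete, here is a minimal sketch (not the paper's latent variable setting) of estimating the FIM as the empirical covariance of per-observation scores, for an i.i.d. Gaussian sample where the score and the true FIM are available in closed form. The model choice and parameterization are illustrative assumptions.

```python
import numpy as np

# Sketch: score-based FIM estimate for an i.i.d. N(mu, sigma^2) sample.
# The per-observation score with respect to theta = (mu, sigma^2) is
#   s_i = ( (y_i - mu) / sigma^2 ,  ((y_i - mu)^2 - sigma^2) / (2 sigma^4) ),
# and the estimator is the empirical covariance (here second moment, since
# the score has mean zero at the true parameter): I_hat = (1/n) sum s_i s_i^T.

rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0
y = rng.normal(mu, sigma, size=100_000)

def scores(y, mu, sigma2):
    r = y - mu
    return np.column_stack([r / sigma2, (r**2 - sigma2) / (2 * sigma2**2)])

s = scores(y, mu, sigma**2)
fim_hat = s.T @ s / len(y)  # empirical covariance of the score

# Exact per-observation FIM: diag(1/sigma^2, 1/(2 sigma^4)) = diag(0.25, 0.03125)
fim_true = np.diag([1 / sigma**2, 1 / (2 * sigma**4)])
print(np.round(fim_hat, 4))
```

In practice the scores would be evaluated at the estimated parameter rather than the true one; the point of the sketch is that only first derivatives of the log-likelihood appear.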
We prove that the estimation algorithm is consistent and asymptotically Gaussian as the number of iterations goes to infinity. We evaluate the finite-sample properties of the proposed estimate and of the observed FIM through simulation studies in linear mixed effects models and mixture models. We also investigate the convergence properties of the estimation algorithm in nonlinear mixed effects models, and we compare the performance of the proposed algorithm to that of other existing methods.
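In latent variable models, the observed-data score can be obtained from first derivatives of the complete-data log-likelihood via Fisher's identity (the observed score is the conditional expectation of the complete-data score given the data). A hedged toy illustration, not the paper's models: a two-component Gaussian mixture with latent label z_i ~ Bernoulli(p) and y_i | z_i ~ N(z_i, 1), where the conditional expectation happens to be available in closed form, so no stochastic approximation step is needed.

```python
import numpy as np

# Toy latent variable model (illustrative assumption): z_i ~ Bern(p),
# y_i | z_i ~ N(z_i, 1), parameter theta = p.
rng = np.random.default_rng(1)
p, n = 0.3, 50_000
z = rng.random(n) < p
y = rng.normal(z.astype(float), 1.0)

# Complete-data score wrt p:  z/p - (1 - z)/(1 - p)  (a FIRST derivative only).
# Posterior weight P(z_i = 1 | y_i); Gaussian normalizing constants cancel.
w = p * np.exp(-0.5 * (y - 1.0) ** 2)
w = w / (w + (1 - p) * np.exp(-0.5 * y**2))

# Fisher's identity: observed score = E[complete-data score | y].
s = w / p - (1 - w) / (1 - p)

# Score-based FIM estimate: empirical covariance of the per-individual scores
# (the score has mean ~ 0 at the true parameter, so the second moment suffices).
fim_hat = np.mean(s**2)
print(round(fim_hat, 3))
```

In models where P(z | y) is not tractable, this conditional expectation would instead be approximated by simulating the latent variables, which is where a stochastic approximation scheme comes in.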


