Robust Estimation under Linear Mixed Models: The Minimum Density Power Divergence Approach

10/12/2020
by Giovanni Saraceno, et al.

Many real-life data sets can be analyzed using Linear Mixed Models (LMMs). Since these models are ordinarily based on normality assumptions, inference can be highly unstable under small deviations from the model when the associated parameters are estimated by classical methods. On the other hand, the density power divergence (DPD) family, which measures the discrepancy between two probability density functions, has been used successfully to build robust estimators that are highly stable with minimal loss of efficiency. Here, we develop the minimum DPD estimator (MDPDE) for independent but non-identically distributed observations in LMMs. We prove its theoretical properties, including consistency and asymptotic normality. The influence function and sensitivity measures are studied to explore its robustness properties. As a data-based choice of the MDPDE tuning parameter α is very important, we propose two candidates as "optimal" choices, where optimality is in the sense of choosing the strongest downweighting that is necessary for the particular data set. We conduct a simulation study comparing the proposed MDPDE, for different values of α, with S-estimators, M-estimators and the classical maximum likelihood estimator under different levels of contamination. Finally, we illustrate the performance of our proposal on a real-data example.
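For context, the density power divergence between a data density g and a model density f is commonly written in the DPD literature as (the notation here is illustrative; the paper's exact formulation for non-identically distributed LMM observations may differ):

d_\alpha(g, f) = \int \Big[ f^{1+\alpha}(x) - \Big(1 + \tfrac{1}{\alpha}\Big)\, g(x)\, f^{\alpha}(x) + \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \Big] \, dx, \qquad \alpha > 0.

As α → 0 this divergence tends to the Kullback-Leibler divergence, so the MDPDE recovers the maximum likelihood estimator in the limit, while α = 1 yields the squared L2 distance. Larger values of α downweight outlying observations more strongly, trading some efficiency at the model for robustness, which is why a data-based choice of α is emphasized in the abstract above.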
