Minimum divergence estimators, Maximum Likelihood and the generalized bootstrap

11/03/2020
by Michel Broniatowski, et al.

This paper is an attempt to provide a justification for the use of some discrepancy indexes, starting from the classical Maximum Likelihood definition and adapting the corresponding basic principle of inference to situations where minimization of those indexes, between a model and some extension of the empirical measure of the data, appears as its natural extension. This leads to the so-called generalized bootstrap setting, for which minimum divergence inference seems to replace Maximum Likelihood inference.

1 Motivation and context

Divergences between probability measures are widely used in Statistics and Data Science in order to perform inference under models of various kinds, parametric or semiparametric, or even in nonparametric settings. The corresponding methods extend the likelihood paradigm and insert inference in some minimum "distance" framing, which provides a convenient description of the properties of the resulting estimators and tests, under the model or under misspecification. Furthermore, they pave the way to a large number of competitive methods, which allow for trade-offs between efficiency and robustness, among others. Many families of such divergences have been proposed, some of them stemming from classical statistics (such as the Chi-square), while others have their origin in other fields such as information theory. Some measures of discrepancy involve regularity of the corresponding probability measures, while others seem to be restricted to measures on finite or countable spaces, at least when using them as inferential tools, hence in situations where the elements of a model have to be confronted with a dataset. The choice of a specific discrepancy measure in a specific context is somewhat arbitrary in many cases, although the resulting conclusions of the inference might differ accordingly, above all under misspecification; however, the need for such approaches is clear when aiming at robustness.
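Since the paper's own construction is not reproduced here, the following Python sketch is only illustrative of the general idea: it estimates a Bernoulli parameter by minimizing a Cressie-Read power divergence between the model and a weighted version of the empirical measure, where drawing the weights at random yields one generalized-bootstrap replication. All function names and parameter choices below are assumptions made for the example, not the paper's code.

# Illustrative sketch only (not from the paper): minimum power-divergence
# estimation of a Bernoulli parameter, with a generalized bootstrap obtained
# by replacing the uniform empirical weights 1/n by random exponential weights.
import numpy as np
from scipy.optimize import minimize_scalar

def power_divergence(p_model, p_emp, gamma=2.0):
    # Cressie-Read power divergence on a finite support; gamma = 2 gives half
    # the Pearson chi-square, and gamma -> 1 recovers Kullback-Leibler, whose
    # minimization is Maximum Likelihood.
    r = p_emp / p_model
    return np.sum(p_model * (r**gamma - gamma * r + gamma - 1)) / (gamma * (gamma - 1))

def min_divergence_estimate(x, weights=None, gamma=2.0):
    # `weights` generalizes the empirical measure: uniform weights give the
    # plain estimator, random weights give one generalized-bootstrap draw.
    x = np.asarray(x, dtype=float)
    w = np.full(len(x), 1.0 / len(x)) if weights is None else weights / weights.sum()
    p_emp = np.array([w[x == 0].sum(), w[x == 1].sum()])  # weighted empirical measure on {0, 1}
    objective = lambda t: power_divergence(np.array([1 - t, t]), p_emp, gamma)
    return minimize_scalar(objective, bounds=(1e-6, 1 - 1e-6), method="bounded").x

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.3, size=200)
theta_hat = min_divergence_estimate(x)  # point estimate
replications = [min_divergence_estimate(x, weights=rng.exponential(size=len(x)))
                for _ in range(500)]    # generalized bootstrap distribution
print(theta_hat, np.std(replications))

In this toy setting the bootstrap replications are obtained simply by re-running the same minimum divergence program under randomly reweighted data, which is the sense in which minimum divergence inference replaces Maximum Likelihood in the generalized bootstrap framework described above.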
