Concentration inequalities of MLE and robust MLE

10/17/2022
by Xiaowei Yang, et al.

The Maximum Likelihood Estimator (MLE) plays an important role in statistics and machine learning. In this article, for i.i.d. random variables, we obtain constant-specified and sharp concentration inequalities and oracle inequalities for the MLE under exponential moment conditions only. Furthermore, in a robust setting, sub-Gaussian-type oracle inequalities for the log-truncated maximum likelihood estimator are derived under a second-moment condition.
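The robust estimator referred to above is obtained by truncating the log-likelihood before summing: instead of maximizing the empirical log-likelihood sum over log f_theta(X_i), one maximizes (1/alpha) times the sum of psi(alpha * log f_theta(X_i)) for a slowly growing truncation function psi and a truncation level alpha > 0, so that a few extreme observations cannot dominate the objective. The sketch below is a minimal Python illustration assuming a Catoni-type psi and a Gaussian location model with known variance; the function names, the level alpha = 0.05, and the contamination scheme are illustrative choices and not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def psi(x):
    # Catoni-type truncation: log(1 + x + x^2/2) for x >= 0 and
    # -log(1 - x + x^2/2) for x < 0. It grows only logarithmically, so a
    # handful of extreme log-likelihood values cannot dominate the sum.
    return np.where(x >= 0,
                    np.log1p(x + 0.5 * x**2),
                    -np.log1p(-x + 0.5 * x**2))

def truncated_loglik(theta, data, alpha):
    # (1/alpha) * sum_i psi(alpha * log f_theta(X_i)) for an N(theta, 1) model.
    loglik = -0.5 * (data - theta) ** 2 - 0.5 * np.log(2.0 * np.pi)
    return psi(alpha * loglik).sum() / alpha

def log_truncated_mle(data, alpha=0.05):
    # Maximize the truncated objective over theta (1-D bounded search).
    res = minimize_scalar(lambda t: -truncated_loglik(t, data, alpha),
                          bounds=(data.min(), data.max()), method="bounded")
    return res.x

if __name__ == "__main__":
    # Gaussian sample contaminated by heavy-tailed noise (illustrative only).
    rng = np.random.default_rng(0)
    clean = rng.normal(loc=1.0, scale=1.0, size=500)
    outliers = 50.0 * rng.standard_t(df=1.5, size=25)
    x = np.concatenate([clean, outliers])
    print("plain MLE (sample mean):", x.mean())
    print("log-truncated MLE:      ", log_truncated_mle(x))
```

On clean Gaussian data the two estimators nearly coincide; under the heavy-tailed contamination above, the sample mean drifts while the truncated objective stays close to the true location, which is the behavior the sub-Gaussian-type oracle inequalities quantify.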


Related research

- Some Hoeffding- and Bernstein-type Concentration Inequalities (02/11/2021): We prove concentration inequalities for functions of independent random ...
- Introduction to Concentration Inequalities (10/04/2019): In this report, we aim to exemplify concentration inequalities and provi...
- Sharp oracle inequalities for stationary points of nonconvex penalized M-estimators (02/27/2018): Many statistical estimation procedures lead to nonconvex optimization pr...
- Distributed Statistical Estimation and Rates of Convergence in Normal Approximation (04/09/2017): This paper presents new algorithms for distributed statistical estimatio...
- Sparse space-time models: Concentration Inequalities and Lasso (07/19/2018): Inspired by Kalikow-type decompositions, we introduce a new stochastic m...
- Tight Non-asymptotic Inference via Sub-Gaussian Intrinsic Moment Norm (03/13/2023): In non-asymptotic statistical inferences, variance-type parameters of su...
