Strongly Consistent Kullback-Leibler Divergence Estimator and Tests for Model Selection Based on a Bias Reduced Kernel Density Estimator

05/18/2018
by Papa Ngom, et al.

In this paper, we study the strong consistency of a bias reduced kernel density estimator and derive a strongly consistent Kullback-Leibler divergence (KLD) estimator. As applications, we formulate a goodness-of-fit test and an asymptotically standard normal test for model selection. Monte Carlo simulations show the effectiveness of the proposed estimation methods and statistical tests.
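To make the idea concrete, below is a minimal sketch of a plug-in KLD estimator built on a kernel density estimate, together with a small Monte Carlo check. It is only illustrative: it uses an off-the-shelf Gaussian KDE (scipy's gaussian_kde) rather than the paper's bias reduced kernel estimator, and the toy normal data and candidate models are assumptions, not the paper's simulation design.

```python
# Illustrative plug-in KLD estimator using a standard Gaussian KDE.
# NOTE: this is a sketch under assumptions; it does not reproduce the
# bias reduced kernel estimator or the test statistics of the paper.
import numpy as np
from scipy.stats import gaussian_kde, norm


def kl_divergence_estimate(sample, model_pdf):
    """Plug-in estimate of KL(f || g), where f is the unknown data density
    (estimated from `sample` by a kernel density estimator) and g is a
    candidate model density given by `model_pdf`."""
    f_hat = gaussian_kde(sample)   # kernel density estimate of f
    fx = f_hat(sample)             # f_hat(X_i)
    gx = model_pdf(sample)         # g(X_i)
    # (1/n) * sum_i log( f_hat(X_i) / g(X_i) )
    return np.mean(np.log(fx / gx))


# Monte Carlo illustration: data from N(0, 1), two candidate models.
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=2000)

kl_correct = kl_divergence_estimate(x, lambda t: norm.pdf(t, loc=0.0, scale=1.0))
kl_shifted = kl_divergence_estimate(x, lambda t: norm.pdf(t, loc=0.5, scale=1.0))

print(f"KL estimate, correct model N(0,1)  : {kl_correct:.4f}")  # close to 0
print(f"KL estimate, shifted model N(0.5,1): {kl_shifted:.4f}")  # close to 0.125
```

The shifted-model case has true divergence (0.5)^2 / 2 = 0.125, so the gap between the two estimates mimics how a KLD-based criterion can discriminate between candidate models; the kernel-induced bias in the first estimate is exactly what a bias reduced density estimator is meant to shrink.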


Related research

08/15/2018 - Model Selection via the VC-Dimension
We derive an objective function that can be optimized to give an estimat...

09/13/2019 - Some improvement on non-parametric estimation of income distribution and poverty index
In this paper, we propose an estimator of Foster, Greer and Thorbecke cl...

03/08/2019 - Kernel Based Estimation of Spectral Risk Measures
Spectral risk measures (SRMs) belong to the family of coherent risk mea...

02/09/2016 - A Kernel Test of Goodness of Fit
We propose a nonparametric statistical test for goodness-of-fit: given a...

05/28/2020 - Boundary-free Kernel-smoothed Goodness-of-fit Tests for Data on General Interval
We propose kernel-type smoothed Kolmogorov-Smirnov and Cramér-von Mises ...

06/18/2012 - Tighter Variational Representations of f-Divergences via Restriction to Probability Measures
We show that the variational representations for f-divergences currently...

09/27/2022 - Monte-Carlo Sampling Approach to Model Selection: A Primer
Any data modeling exercise has two main components: parameter estimation...
