Strong Consistency of a Kullback-Leibler Divergence Estimator and Tests for Model Selection Based on a Bias-Reduced Kernel Density Estimator
In this paper, we study the strong consistency of a bias-reduced kernel density estimator and derive a strongly consistent Kullback-Leibler divergence (KLD) estimator. As an application, we formulate a goodness-of-fit test and an asymptotically standard normal test for model selection. Monte Carlo simulations show the effectiveness of the proposed estimation methods and statistical tests.
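To illustrate the general idea behind such estimators, the sketch below shows a plug-in KLD estimator in Python: the unknown density is replaced by a kernel density estimate and the divergence against a candidate parametric model is approximated by a sample average. This is only a minimal sketch of a standard plug-in construction; it uses an ordinary Gaussian KDE rather than the bias-reduced estimator developed in the paper (whose form is not given in this abstract), and the function and variable names (`kld_plugin_estimate`, `model_pdf`) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

def kld_plugin_estimate(sample, model_pdf):
    """Plug-in estimate of KL(f || g): average of log(f_hat(X_i) / g(X_i)).

    `sample`    : 1-D array of observations from the unknown density f.
    `model_pdf` : candidate model density g evaluated pointwise.
    Note: a plain Gaussian KDE stands in for the paper's bias-reduced
    kernel density estimator, which this sketch does not implement.
    """
    f_hat = gaussian_kde(sample)          # kernel density estimate of f
    fx = f_hat(sample)                    # estimated density at the sample points
    gx = model_pdf(sample)                # candidate model density at the sample points
    # Guard against numerical zeros before taking logarithms.
    eps = np.finfo(float).tiny
    return np.mean(np.log(np.maximum(fx, eps)) - np.log(np.maximum(gx, eps)))

# Example usage (hypothetical): data drawn from N(0, 1), candidate model N(0.5, 1).
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=2000)
kld = kld_plugin_estimate(x, lambda t: norm.pdf(t, loc=0.5, scale=1.0))
print(f"Estimated KL divergence: {kld:.4f}")
```

In a model-selection setting, estimates of this kind computed for two competing candidate models can be compared, with the difference standardized to obtain an asymptotically normal test statistic, as in the test proposed in the paper.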