Entropy-based test for generalized Gaussian distributions

10/13/2020
by Mehmet Siddik Cadirci et al.

In this paper, we prove L^2 consistency of the kth nearest neighbour distance estimator of Shannon entropy for an arbitrary fixed k ≥ 1. Based on the maximum entropy principle, we construct a non-parametric goodness-of-fit test for a newly introduced class of generalized multivariate Gaussian distributions. The theoretical results are supported by numerical studies on simulated samples.
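The estimator studied in the abstract is of the Kozachenko–Leonenko type, which infers entropy from the distance of each sample to its kth nearest neighbour. As a rough illustration (not the authors' exact construction, and the function name and parameters are illustrative), a standard Kozachenko–Leonenko estimate can be sketched as:

```python
import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree

def knn_entropy(x, k=1):
    """Kozachenko-Leonenko k-NN estimate of Shannon entropy, in nats.

    x : array of shape (n, d), n i.i.d. samples from a density on R^d.
    k : fixed nearest-neighbour order, k >= 1.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Distance from each point to its k-th nearest neighbour,
    # querying k+1 neighbours to skip the point itself (distance 0).
    rho = tree.query(x, k=k + 1)[0][:, k]
    # log volume of the unit ball in R^d: pi^{d/2} / Gamma(d/2 + 1)
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(rho))
```

For a univariate standard Gaussian the true entropy is (1/2) log(2πe) ≈ 1.419 nats, and on a few thousand samples the estimate lands close to that value, consistent with the L^2 convergence the paper establishes.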
