Entropy-based test for generalized Gaussian distributions

10/13/2020
by Mehmet Siddik Cadirci et al.

In this paper, we prove the L² consistency of the kth nearest neighbour distance estimator of the Shannon entropy for an arbitrary fixed k ≥ 1. Building on a maximum entropy principle, we construct a non-parametric goodness-of-fit test for a class of generalized multivariate Gaussian distributions introduced here. The theoretical results are accompanied by numerical studies on simulated samples.
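The kth nearest neighbour entropy estimator referenced in the abstract is, in the standard formulation, the Kozachenko–Leonenko estimator generalized to arbitrary k. A minimal sketch of that estimator is given below; this is the textbook form, not necessarily the exact variant analysed in the paper, and the function name `knn_entropy` is our own.

```python
import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree

def knn_entropy(x, k=1):
    """Kozachenko-Leonenko k-th nearest-neighbour estimate of
    Shannon (differential) entropy, in nats.

    x : (n, d) array of i.i.d. samples
    k : order of the nearest neighbour, fixed and >= 1
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # query k+1 neighbours because each point is its own 0-th neighbour
    dist, _ = tree.query(x, k=k + 1)
    rho = dist[:, k]  # distance to the k-th nearest neighbour
    # log-volume of the unit d-ball: pi^(d/2) / Gamma(d/2 + 1)
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # H_hat = psi(n) - psi(k) + log V_d + (d/n) * sum_i log rho_k(i)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(rho))
```

For a univariate standard normal sample the estimate should approach the true entropy ½·log(2πe) ≈ 1.419 nats as n grows, which is the consistency property the paper establishes in L².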


