Objective Bayesian Analysis for the Differential Entropy of the Gamma Distribution

12/28/2020
by Eduardo Ramos, et al.

The use of entropy-related concepts ranges from physics, such as statistical mechanics, to evolutionary biology. The Shannon entropy is a measure used to quantify the amount of information in a system, and its estimation is usually carried out under the frequentist approach. In the present paper, we introduce a fully objective Bayesian analysis to obtain the posterior distribution of this measure. Notably, we consider the Gamma distribution, which describes many natural phenomena in physics, engineering, and biology. We reparametrize the model in terms of the entropy, and different objective priors are derived, such as the Jeffreys prior, reference prior, and matching priors. Since the obtained priors are improper, we prove that the resulting posterior distributions are proper and that their posterior means are finite. An intensive simulation study is conducted to select the prior that returns the best results in terms of bias, mean squared error, and coverage probability. The proposed approach is illustrated with two datasets: the first is related to the reign periods of the Achaemenid dynasty, and the second describes the time to failure of an electronic component of a sugarcane harvester.
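For reference, the differential entropy that the paper reparametrizes over has a standard closed form for a Gamma variable with shape α and scale θ: H = α + ln θ + ln Γ(α) + (1 − α)ψ(α), where ψ is the digamma function. The sketch below is not taken from the paper; it only illustrates the quantity being estimated, evaluating this closed form and cross-checking it against SciPy's built-in entropy for the same parameterization.

```python
import numpy as np
from scipy.special import gammaln, digamma
from scipy.stats import gamma

def gamma_entropy(alpha, theta):
    """Closed-form differential entropy of Gamma(shape=alpha, scale=theta):
    H = alpha + ln(theta) + ln(Gamma(alpha)) + (1 - alpha) * psi(alpha)."""
    return alpha + np.log(theta) + gammaln(alpha) + (1.0 - alpha) * digamma(alpha)

# Illustrative parameter values (not from the paper).
alpha, theta = 2.5, 1.8
print(gamma_entropy(alpha, theta))        # closed-form value
print(gamma.entropy(alpha, scale=theta))  # SciPy's value; should agree
```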

