Generalized Bayesian Cramér-Rao Inequality via Information Geometry of Relative α-Entropy

02/11/2020
by   Kumar Vijay Mishra, et al.

The relative α-entropy is the Rényi analog of relative entropy and arises prominently in information-theoretic problems. Recent information-geometric investigations of this quantity have enabled a generalization of the Cramér-Rao inequality, which provides a lower bound on the variance of an estimator of an escort of the underlying parametric probability distribution. However, this generalization has not yet been examined in the Bayesian setting. In this paper, we propose a general Riemannian metric based on relative α-entropy and use it to obtain a generalized Bayesian Cramér-Rao inequality. This establishes a lower bound on the variance of an unbiased estimator for the α-escort distribution, starting from an unbiased estimator for the underlying distribution. We show that in the limiting case, as the entropy order approaches unity, this framework reduces to the conventional Bayesian Cramér-Rao inequality. Further, in the absence of priors, the same framework yields the deterministic Cramér-Rao inequality.
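The α-escort construction and the α→1 limit mentioned in the abstract can be illustrated numerically. The sketch below (the distributions `p` and `q` are illustrative, not from the paper) forms the α-escort of a discrete distribution and evaluates the Rényi divergence, which recovers the Kullback-Leibler divergence as α→1, mirroring how the generalized bound reduces to the conventional Bayesian Cramér-Rao inequality:

```python
import numpy as np

def escort(p, alpha):
    # alpha-escort of a discrete distribution: p_i^alpha / sum_j p_j^alpha
    w = p ** alpha
    return w / w.sum()

def renyi_divergence(p, q, alpha):
    # Renyi divergence D_alpha(p||q) = log(sum p^alpha q^(1-alpha)) / (alpha - 1);
    # at alpha = 1 it reduces to the Kullback-Leibler divergence.
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

# Illustrative distributions (assumed for this sketch).
p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])

# The escort is itself a probability distribution, and at alpha = 1
# it coincides with p; near alpha = 1 the Renyi divergence is close to KL.
```

As α moves away from 1, the escort reweights the distribution (small α flattens it, large α sharpens it), which is the sense in which the generalized inequality bounds estimators of the α-escort rather than of the underlying distribution itself.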


research
01/14/2020

Cramér-Rao Lower Bounds Arising from Generalized Csiszár Divergences

We study the geometry of probability distributions with respect to a gen...
research
07/10/2019

Entropy and Compression: A simple proof of an inequality of Khinchin

We prove that Entropy is a lower bound for the average compression ratio...
research
04/02/2021

Hybrid and Generalized Bayesian Cramér-Rao Inequalities via Information Geometry

Information geometry is the study of statistical models from a Riemannia...
research
10/14/2022

On Triangular Inequality of the Discounted Least Information Theory of Entropy (DLITE)

The Discounted Least Information Theory of Entropy (DLITE) is a new info...
research
06/30/2017

Barankin Vector Locally Best Unbiased Estimates

The Barankin bound is generalized to the vector case in the mean square ...
research
01/15/2018

Information Geometric Approach to Bayesian Lower Error Bounds

Information geometry describes a framework where probability densities c...
research
02/22/2019

A Family of Bayesian Cramér-Rao Bounds, and Consequences for Log-Concave Priors

Under minimal regularity assumptions, we establish a family of informati...
