Cramér-Rao-type Bound and Stam's Inequality for Discrete Random Variables

04/29/2019
by Tomohiro Nishiyama et al.

The variance and the entropy power of a continuous random variable are bounded from below by the reciprocal of its Fisher information, through the Cramér-Rao bound and Stam's inequality respectively. In this note, we introduce the Fisher information for discrete random variables and derive the discrete Cramér-Rao-type bound and the discrete Stam's inequality.
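As a quick illustration of the two classical continuous inequalities the abstract refers to (not of the paper's discrete results), the sketch below numerically checks the Cramér-Rao bound Var(X) >= 1/J(X) and Stam's inequality N(X) = e^{2h(X)}/(2*pi*e) >= 1/J(X) for a Laplace density. The scale parameter, grid, and density choice are arbitrary assumptions made only for this example.

```python
# Minimal numerical sketch (not code from the paper): check the classical
# continuous Cramer-Rao bound Var(X) >= 1/J(X) and Stam's inequality
# N(X) = exp(2*h(X)) / (2*pi*e) >= 1/J(X) for a Laplace density.
import numpy as np

b = 1.5                                   # assumed scale parameter of the Laplace density
x = np.linspace(-60.0, 60.0, 1_200_001)   # fine grid; tails beyond +/-60 are negligible
dx = x[1] - x[0]
f = np.exp(-np.abs(x) / b) / (2.0 * b)    # Laplace pdf, mean 0, variance 2*b**2

variance = np.sum(x**2 * f) * dx                              # Var(X)
entropy = -np.sum(f * np.log(f)) * dx                         # differential entropy h(X)
entropy_power = np.exp(2.0 * entropy) / (2.0 * np.pi * np.e)  # entropy power N(X)

score = np.gradient(np.log(f), dx)        # d/dx log f(x)
fisher = np.sum(score**2 * f) * dx        # J(X) = E[(d/dx log f(X))^2]

print(f"Cramer-Rao : Var(X) = {variance:.4f} >= 1/J(X) = {1.0 / fisher:.4f}")
print(f"Stam       : N(X)   = {entropy_power:.4f} >= 1/J(X) = {1.0 / fisher:.4f}")
```

For the Laplace density both bounds hold strictly: Var(X) = 2b^2 and N(X) = 2e b^2 / pi, while 1/J(X) = b^2; equality in both inequalities is attained only by the Gaussian.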

