
Properties of discrete Fisher information: Cramér-Rao-type and log-Sobolev-type inequalities

04/29/2019
by Tomohiro Nishiyama, et al.

The Fisher information is connected to the standard deviation and to the Shannon differential entropy through the Cramér-Rao bound and the log-Sobolev inequality, respectively. These inequalities hold for continuous distributions. In this paper, we introduce a Fisher information for discrete distributions (DFI) and show that the DFI satisfies a Cramér-Rao-type bound and a log-Sobolev-type inequality.
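As a point of reference for the abstract, the classical continuous Cramér-Rao relation it builds on, Var(X) ≥ 1/I(X), can be checked numerically. The sketch below (not from the paper; the paper's DFI is defined only in the full text) uses a Gaussian location family, where the Fisher information has the known closed form I = 1/σ² and the bound holds with equality.

```python
import random

# Hedged illustration, not the paper's method: verify the continuous
# Cramer-Rao relation Var(X) >= 1 / I(X) for a Gaussian location family,
# where equality holds and I = 1 / sigma^2 in closed form.

def gaussian_fisher_information(sigma: float) -> float:
    # Fisher information of the location parameter of N(mu, sigma^2).
    return 1.0 / sigma ** 2

def sample_variance(sigma: float, n: int = 200_000, seed: int = 0) -> float:
    # Monte Carlo estimate of Var(X) for X ~ N(0, sigma^2).
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, sigma) for _ in range(n)]
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / n

sigma = 2.0
var = sample_variance(sigma)
bound = 1.0 / gaussian_fisher_information(sigma)  # Cramer-Rao lower bound
print(var, bound)  # empirical variance lands close to the bound of 4.0
```

The paper's contribution is a discrete analogue of this inequality; the Gaussian case above is only the continuous baseline that the discrete version is compared against.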


Related research

Cramér-Rao-type Bound and Stam's Inequality for Discrete Random Variables (04/29/2019)
Reversals of Rényi Entropy Inequalities under Log-Concavity (05/21/2020)
Dual Loomis-Whitney inequalities via information theory (01/21/2019)
Understanding VAEs in Fisher-Shannon Plane (07/10/2018)
The Relation Between Bayesian Fisher Information and Shannon Information for Detecting a Change in a Parameter (01/31/2019)
Generalized Fano-Type Inequality for Countably Infinite Systems with List-Decoding (01/09/2018)