
# Properties of discrete Fisher information: Cramér-Rao-type and log-Sobolev-type inequalities

The Fisher information has connections to the standard deviation and the Shannon differential entropy through the Cramér-Rao bound and the log-Sobolev inequality, respectively. These inequalities hold for continuous distributions. In this paper, we introduce a Fisher information for discrete distributions (DFI) and show that the DFI satisfies a Cramér-Rao-type bound and a log-Sobolev-type inequality.
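For reference, the two continuous-case inequalities the abstract alludes to can be written in their standard one-dimensional forms (with $I(X)$ the Fisher information, $h(X)$ the differential entropy, and $N(X)$ the entropy power):

```latex
% Cramér–Rao bound: the variance is bounded below by the reciprocal Fisher information
\operatorname{Var}(X) \;\ge\; \frac{1}{I(X)}

% Log-Sobolev inequality in Stam's form: entropy power times Fisher information
% is at least one, where N(X) = \tfrac{1}{2\pi e}\, e^{2 h(X)}
N(X)\, I(X) \;\ge\; 1
```

The paper's contribution, per the abstract, is establishing discrete analogues of these two statements for the DFI.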

04/29/2019
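As a quick numerical illustration, the sketch below computes one *standard discrete analog* of Fisher information, $I(p) = \sum_k (p(k{+}1) - p(k))^2 / p(k)$, for a truncated Poisson pmf and compares the variance against $1/I(p)$. This definition and the truncation point `n_max` are assumptions chosen for illustration; they are not necessarily the DFI defined in the paper.

```python
import math

def poisson_pmf(lam, n_max):
    # Poisson(lam) pmf truncated to {0, ..., n_max}; the neglected tail is tiny
    return [math.exp(-lam) * lam**k / math.factorial(k) for k in range(n_max + 1)]

def discrete_fisher(p):
    # One common discrete analog of Fisher information (an assumption here,
    # not necessarily the paper's DFI): sum_k (p(k+1) - p(k))^2 / p(k),
    # with p(n_max + 1) taken to be 0.
    q = p + [0.0]
    return sum((q[k + 1] - q[k]) ** 2 / q[k] for k in range(len(p)) if q[k] > 0)

def variance(p):
    mean = sum(k * pk for k, pk in enumerate(p))
    return sum((k - mean) ** 2 * pk for k, pk in enumerate(p))

p = poisson_pmf(4.0, 60)
I = discrete_fisher(p)
var = variance(p)
print("Var =", var, " 1/I =", 1.0 / I, " Var >= 1/I:", var >= 1.0 / I)
```

For this particular pmf the Cramér-Rao-type comparison $\operatorname{Var} \ge 1/I$ comes out true, but a single numeric check is of course not the paper's proof, and the inequality the paper proves applies to its own DFI definition.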

### Cramér-Rao-type Bound and Stam's Inequality for Discrete Random Variables

The variance and the entropy power of a continuous random variable are b...
05/21/2020

### Reversals of Rényi Entropy Inequalities under Log-Concavity

We establish a discrete analog of the Rényi entropy comparison due to Bo...
05/27/2013

### On some interrelations of generalized q-entropies and a generalized Fisher information, including a Cramér-Rao inequality

In this communication, we describe some interrelations between generaliz...
01/21/2019

### Dual Loomis-Whitney inequalities via information theory

We establish lower bounds on the volume and the surface area of a geomet...
07/10/2018

### Understanding VAEs in Fisher-Shannon Plane

In information theory, Fisher information and Shannon information (entro...
01/31/2019

### The Relation Between Bayesian Fisher Information and Shannon Information for Detecting a Change in a Parameter

We derive a connection between the performance of estimators and the performance...
01/09/2018

### Generalized Fano-Type Inequality for Countably Infinite Systems with List-Decoding

This study investigates generalized Fano-type inequalities in the follow...