Entropy of the Conditional Expectation under Gaussian Noise

06/08/2021
by Arda Atalik, et al.

This paper considers an additive Gaussian noise channel, Y = X + N, with an arbitrarily distributed, finite-variance input X. It studies the differential entropy of the minimum mean-square error (MMSE) estimator E[X|Y] and provides a new lower bound connecting the entropies of the input, output, and conditional mean: the sum of the entropies of the conditional mean and the output is always at least twice the entropy of the input, i.e., h(E[X|Y]) + h(Y) ≥ 2h(X). Various further properties are obtained, including upper bounds, asymptotics, a Taylor series expansion, and a connection to Fisher information. An application of the lower bound to the remote-source-coding problem is discussed, and extensions of the lower and upper bounds to the vector Gaussian channel are given.
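As a quick sanity check of the stated lower bound h(E[X|Y]) + h(Y) ≥ 2h(X), the Gaussian-input case can be worked out in closed form: for X ~ N(0, σ²) and unit-variance noise, the MMSE estimator is linear, all three quantities are Gaussian entropies, and the bound holds with equality. The sketch below (the choice σ² = 2 is arbitrary, not from the paper) verifies this numerically:

```python
import math

def h_gauss(var):
    # Differential entropy (in nats) of a Gaussian with variance `var`.
    return 0.5 * math.log(2 * math.pi * math.e * var)

sigma2 = 2.0  # input variance Var(X); noise is N ~ N(0, 1)

h_X = h_gauss(sigma2)          # h(X)
h_Y = h_gauss(sigma2 + 1.0)    # h(Y), since Var(Y) = Var(X) + 1

# For Gaussian X, E[X|Y] = (sigma2 / (sigma2 + 1)) * Y, which is Gaussian
# with variance (sigma2 / (sigma2 + 1))**2 * (sigma2 + 1) = sigma2**2 / (sigma2 + 1).
h_est = h_gauss(sigma2**2 / (sigma2 + 1.0))

# The two values agree up to floating-point rounding: Gaussian inputs
# achieve the lower bound h(E[X|Y]) + h(Y) >= 2 h(X) with equality.
print(h_est + h_Y, 2 * h_X)
```

For non-Gaussian inputs the inequality is generally strict; the Gaussian case marks the equality point, mirroring the extremal role Gaussians play in many entropy inequalities.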


Related research

- Remote Source Coding under Gaussian Noise: Dueling Roles of Power and Entropy Power (05/16/2018). The distributed remote source coding (so-called CEO) problem is studied ...
- On the MMSE Estimation of Norm of a Gaussian Vector under Additive White Gaussian Noise with Randomly Missing Input Entries (10/16/2020). This paper considers the task of estimating the l_2 norm of an n-dimensio...
- On the Conditional Smooth Rényi Entropy and its Applications in Guessing and Source Coding (10/22/2018). A novel definition of the conditional smooth Rényi entropy, which is dif...
- MMSE Bounds Under Kullback-Leibler Divergence Constraints on the Joint Input-Output Distribution (06/05/2020). This paper proposes a new family of lower and upper bounds on the minimu...
- Nonparametric Estimation of the Fisher Information and Its Applications (05/07/2020). This paper considers the problem of estimation of the Fisher information...
- A General Derivative Identity for the Conditional Mean Estimator in Gaussian Noise and Some Applications (04/05/2021). Consider a channel Y = X + N where X is an n-dimensional random vector, a...
- Complexity, Statistical Risk, and Metric Entropy of Deep Nets Using Total Path Variation (02/02/2019). For any ReLU network there is a representation in which the sum of the a...
