Estimating Differential Entropy under Gaussian Convolutions

10/27/2018
by Ziv Goldfeld et al.

This paper studies the problem of estimating the differential entropy h(S+Z), where S and Z are independent d-dimensional random variables with Z∼N(0,σ^2 I_d). The distribution of S is unknown, but n independent and identically distributed (i.i.d.) samples from it are available. The question is whether access to samples of S, as opposed to samples of S+Z, can improve estimation performance. We show that the answer is positive. More concretely, we first show that, despite the regularizing effect of the noise, the number of required samples still needs to scale exponentially in d. This result is proven via a random-coding argument that reduces the question to estimating the Shannon entropy on an alphabet of size 2^O(d). Next, for fixed d and n→∞, it is shown that a simple plug-in estimator, given by the differential entropy of the empirical distribution of the S samples convolved with the Gaussian density, achieves a loss of O((log n)^(d/4)/√n). The plug-in estimator here amounts to the differential entropy of a d-dimensional Gaussian mixture, for which we propose an efficient Monte Carlo computation algorithm. By contrast, estimating h(S+Z) via generic differential entropy estimators applied to samples from S+Z would attain only the much slower rate of O(n^(-1/d)), despite the smoothness of P_{S+Z}. As an application, which was in fact our original motivation for the problem, we estimate information flows in deep neural networks and discuss Tishby's Information Bottleneck and the compression conjecture, among other topics.
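The plug-in estimator described above reduces to computing the differential entropy of an n-component Gaussian mixture whose centres are the observed samples of S, evaluated by Monte Carlo. As a rough illustration only, here is a minimal NumPy/SciPy sketch; the function name plugin_entropy_mc, the default sampling size, and the memory layout are our own choices, not the authors' implementation:

import numpy as np
from scipy.special import logsumexp

def plugin_entropy_mc(samples, sigma, n_mc=10000, seed=None):
    # Monte Carlo estimate of h(P_hat * N(0, sigma^2 I_d)): the differential
    # entropy of the Gaussian mixture (1/n) * sum_i N(s_i, sigma^2 I_d).
    rng = np.random.default_rng(seed)
    samples = np.asarray(samples, dtype=float)
    n, d = samples.shape
    # Draw Monte Carlo points from the mixture itself:
    # pick a random centre, then add isotropic Gaussian noise.
    idx = rng.integers(0, n, size=n_mc)
    x = samples[idx] + sigma * rng.standard_normal((n_mc, d))
    # log p(x) = logsumexp_i [ log N(x; s_i, sigma^2 I_d) ] - log n,
    # computed from the (n_mc x n) matrix of squared distances.
    sq = ((x[:, None, :] - samples[None, :, :]) ** 2).sum(axis=-1)
    log_comp = -sq / (2.0 * sigma**2) - 0.5 * d * np.log(2.0 * np.pi * sigma**2)
    log_p = logsumexp(log_comp, axis=1) - np.log(n)
    return -log_p.mean()  # h_hat = -E[log p(X)] with X ~ mixture

For example, with n = 1000 samples of S drawn uniformly from the unit square (d = 2) and sigma = 0.5, plugin_entropy_mc(s, 0.5) returns a Monte Carlo estimate of h(S+Z). The pairwise-distance matrix makes this O(n_mc * n * d) in time and memory, so larger problems would need batching.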

Related research

04/17/2019 · Remarks on the Rényi Entropy of a sum of IID random variables
In this note we study a conjecture of Madiman and Wang which predicted t...

11/14/2019 · Estimating differential entropy using recursive copula splitting
A method for estimating the Shannon differential entropy of multidimensi...

05/08/2021 · Understanding Neural Networks with Logarithm Determinant Entropy Estimator
Understanding the informative behaviour of deep neural networks is chall...

05/23/2018 · Determining the Number of Samples Required to Estimate Entropy in Natural Sequences
Calculating the Shannon entropy for symbolic sequences has been widely c...

04/19/2023 · Entropy Estimation via Uniformization
Entropy estimation is of practical importance in information theory and ...

08/13/2018 · On the Shannon entropy of the number of vertices with zero in-degree in randomly oriented hypergraphs
Suppose that you have n colours and m mutually independent dice, each of...

02/25/2021 · On the consistency of the Kozachenko-Leonenko entropy estimate
We revisit the problem of the estimation of the differential entropy H(f...
