Understanding Neural Networks with Logarithm Determinant Entropy Estimator

05/08/2021
by Zhanghao Zhouyin, et al.

Understanding the informative behaviour of deep neural networks is challenged by misused estimators and the complexity of network structures, which leads to inconsistent observations and diversified interpretations. Here we propose the LogDet estimator, a reliable matrix-based entropy estimator that approximates Shannon differential entropy. We construct information measurements based on the LogDet estimator, verify our method with comparative experiments, and use it to analyse neural network behaviour. Our results demonstrate that the LogDet estimator overcomes the drawbacks arising from highly diverse and degenerate distributions and is therefore reliable for estimating entropy in neural networks. The network analysis also reveals a functional distinction between shallow and deeper layers, which can help explain the compression phenomenon in the information bottleneck theory of neural networks.
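The abstract describes a matrix-based, log-determinant entropy estimator applied to neural network activations. As a rough illustration only, the minimal sketch below computes a Gaussian differential entropy estimate from the log-determinant of a ridge-regularized activation covariance; the function name logdet_entropy, the ridge parameter eps, and the synthetic "shallow"/"deep" activations are illustrative assumptions, not the paper's exact estimator or experiments.

```python
import numpy as np

def logdet_entropy(activations: np.ndarray, eps: float = 1e-3) -> float:
    """Illustrative log-determinant entropy estimate for a batch of
    layer activations with shape (n_samples, n_features).

    Treats the activations as approximately Gaussian, so the Shannon
    differential entropy is 0.5 * logdet(2*pi*e * Sigma). The small
    ridge term eps keeps the covariance well conditioned when the
    activation distribution is degenerate (low rank).
    """
    n, d = activations.shape
    centered = activations - activations.mean(axis=0, keepdims=True)
    cov = centered.T @ centered / max(n - 1, 1) + eps * np.eye(d)
    # slogdet is numerically safer than log(det(...)) for large d
    _, logdet = np.linalg.slogdet(2 * np.pi * np.e * cov)
    return 0.5 * logdet

# Hypothetical comparison: a diverse layer vs. a nearly degenerate one
rng = np.random.default_rng(0)
shallow = rng.normal(size=(512, 64))
deep = rng.normal(size=(512, 64)) @ np.diag(
    np.concatenate([np.ones(8), 1e-4 * np.ones(56)]))
print(logdet_entropy(shallow), logdet_entropy(deep))
```

In this sketch the regularized log-determinant stays finite even when the covariance is rank deficient, which is the kind of degenerate case the abstract argues a reliable estimator must handle.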

Related research

03/09/2020  Learning entropy production via neural networks
This paper presents a neural estimator for entropy production, or NEEP, ...

10/27/2018  Estimating Differential Entropy under Gaussian Convolutions
This paper studies the problem of estimating the differential entropy h(...

09/08/2020  Shannon entropy estimation for linear processes
In this paper, we estimate the Shannon entropy S(f) = -E[log(f(X))] of ...

09/21/2020  On the Efficient Estimation of Min-Entropy
The min-entropy is an important metric to quantify randomness of generat...

02/15/2022  Generalisation and the Risk–Entropy Curve
In this paper we show that the expected generalisation performance of a ...

02/14/2022  KNIFE: Kernelized-Neural Differential Entropy Estimation
Mutual Information (MI) has been widely used as a loss regularizer for t...

07/26/2017  Fast calculation of entropy with Zhang's estimator
Entropy is a fundamental property of a repertoire. Here, we present an e...
