Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group

07/14/2021
by Johanna Erdmenger, et al.

We investigate the analogy between the renormalization group (RG) and deep neural networks, wherein subsequent layers of neurons are analogous to successive steps along the RG. In particular, we quantify the flow of information by explicitly computing the relative entropy or Kullback-Leibler divergence in both the one- and two-dimensional Ising models under decimation RG, as well as in a feedforward neural network as a function of depth. We observe qualitatively identical behavior characterized by a monotonic increase to a parameter-dependent asymptotic value. On the quantum field theory side, the monotonic increase confirms the connection between the relative entropy and the c-theorem. For the neural networks, the asymptotic behavior may have implications for various information maximization methods in machine learning, as well as for disentangling compactness and generalizability. Furthermore, while both the two-dimensional Ising model and the random neural networks we consider exhibit non-trivial critical points, the relative entropy appears insensitive to the phase structure of either system. In this sense, more refined probes are required to fully elucidate the flow of information in these models.
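The Ising-side computation can be made concrete with a small numerical experiment. The following is a minimal sketch, not the paper's code: it assumes exact enumeration of a short periodic one-dimensional Ising chain, the standard zero-field decimation recursion tanh(K') = tanh^2(K), and that the flowed Boltzmann distribution is compared against the initial one via the Kullback-Leibler divergence. The chain length N, initial coupling K0, and number of steps are illustrative choices.

```python
# Minimal sketch (assumed setup, not the paper's code): relative entropy along
# the decimation RG flow of a small one-dimensional Ising chain, computed by
# exact enumeration. N, K0, and STEPS are illustrative choices.
import itertools
import numpy as np

N = 10       # spins on a periodic chain; 2**N configurations is still tractable
K0 = 1.0     # initial dimensionless coupling K = beta * J (assumed value)
STEPS = 8    # number of decimation steps to follow

def boltzmann(K: float) -> np.ndarray:
    """Exact Boltzmann distribution of the periodic 1d Ising chain at coupling K."""
    configs = np.array(list(itertools.product((-1, 1), repeat=N)))
    # nearest-neighbour interaction energy with periodic boundary conditions
    energy = -(configs * np.roll(configs, -1, axis=1)).sum(axis=1)
    weights = np.exp(-K * energy)
    return weights / weights.sum()

def relative_entropy(p: np.ndarray, q: np.ndarray) -> float:
    """Kullback-Leibler divergence D(p || q) in nats."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p0 = boltzmann(K0)   # reference distribution at the initial coupling
K = K0
for n in range(1, STEPS + 1):
    # zero-field decimation recursion: tanh(K') = tanh(K)**2
    K = np.arctanh(np.tanh(K) ** 2)
    print(f"step {n}: K = {K:.4f}  D(p_K || p_K0) = {relative_entropy(boltzmann(K), p0):.4f}")
```

In this toy setting the divergence grows monotonically with the number of decimation steps toward a K0-dependent value, matching the qualitative behavior described in the abstract. An analogous estimate on the network side would replace the exact Boltzmann distributions with histogram or kernel estimates of the activation distribution at each layer, tracked as a function of depth.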
