Adaptive Estimators Show Information Compression in Deep Neural Networks

02/24/2019
by Ivan Chelombiev, et al.

To improve how neural networks function, it is crucial to understand their learning process. The information bottleneck theory of deep learning proposes that neural networks achieve good generalization by compressing their representations to disregard information that is not relevant to the task. However, empirical evidence for this theory is conflicting, as compression was only observed when networks used saturating activation functions. In contrast, networks with non-saturating activation functions achieved comparable levels of task performance but did not show compression. In this paper we develop more robust mutual information estimation techniques that adapt to the hidden activity of neural networks and produce more sensitive measurements of activations from all activation functions, especially unbounded ones. Using these adaptive estimation techniques, we explore compression in networks with a range of different activation functions. With two improved methods of estimation, we first show that saturation of the activation function is not required for compression, and that the amount of compression varies between different activation functions. We also find that there is a large amount of variation in compression between different network initializations. Second, we see that L2 regularization leads to significantly increased compression while preventing overfitting. Finally, we show that only compression of the last layer is positively correlated with generalization.
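For readers who want a concrete picture of what "adaptive" estimation means here, the sketch below illustrates the general idea of a binning-based mutual information estimate whose bin edges follow the observed range of a layer's activations rather than a fixed interval, so that unbounded activations (e.g. ReLU) are covered. This is a minimal illustration under those assumptions, not the paper's exact estimator; the function names and the `n_bins` parameter are placeholders.

```python
import numpy as np

def discrete_entropy(rows):
    # Empirical Shannon entropy (bits) over distinct binned activation patterns.
    _, counts = np.unique(rows, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def adaptive_binned_mi(activations, labels, n_bins=30):
    """
    Rough estimate of I(X; T) and I(T; Y) for one hidden layer T.
    activations: (n_samples, n_units) hidden activity for a batch of inputs X.
    labels:      (n_samples,) class labels Y.
    Bin edges span the observed activation range (adaptive), not a fixed [0, 1] grid.
    """
    lo, hi = activations.min(), activations.max()
    edges = np.linspace(lo, hi, n_bins + 1)
    binned = np.digitize(activations, edges[1:-1])  # per-unit bin indices

    h_t = discrete_entropy(binned)  # H(T) after discretization
    # For a deterministic network with distinct inputs, I(X; T) = H(T) under binning.
    i_xt = h_t

    # I(T; Y) = H(T) - H(T | Y), with H(T | Y) averaged over class labels.
    h_t_given_y = 0.0
    for y in np.unique(labels):
        mask = labels == y
        h_t_given_y += mask.mean() * discrete_entropy(binned[mask])
    i_ty = h_t - h_t_given_y
    return i_xt, i_ty
```

Called per layer and per training epoch on a fixed evaluation batch, the resulting pairs (I(X;T), I(T;Y)) trace an information-plane trajectory; a late-training decrease in I(X;T) is what the abstract refers to as compression.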

Related research

05/19/2023 · Complexity of Neural Network Training and ETR: Extensions with Effectively Continuous Functions
We study the complexity of the problem of training neural networks defin...

06/13/2020 · Understanding Learning Dynamics of Binary Neural Networks via Information Bottleneck
Compact neural networks are essential for affordable and power efficient...

11/03/2020 · Analytical aspects of non-differentiable neural networks
Research in computational deep learning has directed considerable effort...

05/19/2023 · Justices for Information Bottleneck Theory
This study comes as a timely response to mounting criticism of the infor...

08/02/2022 · Lossy compression of multidimensional medical images using sinusoidal activation networks: an evaluation study
In this work, we evaluate how neural networks with periodic activation f...

03/28/2023 · Function Approximation with Randomly Initialized Neural Networks for Approximate Model Reference Adaptive Control
Classical results in neural network approximation theory show how arbitr...

02/15/2021 · Low Curvature Activations Reduce Overfitting in Adversarial Training
Adversarial training is one of the most effective defenses against adver...