Network Degeneracy as an Indicator of Training Performance: Comparing Finite and Infinite Width Angle Predictions

06/02/2023
by Cameron Jakub, et al.

Neural networks are powerful and widely used functions, but their theoretical behaviour is not fully understood. Stacking many layers to create deep neural networks has achieved exceptional performance in many applications and contributed to the recent explosion of these methods. Previous works have shown that depth can exponentially increase the expressivity of a network. However, as networks grow deeper, they become increasingly susceptible to degeneracy: at initialization, inputs become more and more correlated as they travel through the layers of the network. If a network has too many layers, it tends to approximate a (random) constant function, making it effectively incapable of distinguishing between inputs. This appears to hinder training and cause the network to perform poorly, as we investigate empirically in this paper. We present a simple algorithm that accurately predicts the level of degeneracy for any given fully connected ReLU network architecture, and we demonstrate how the predicted degeneracy relates to the network's training dynamics. We also compare this prediction to predictions derived from infinite-width networks.
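The degeneracy described above, angles between inputs collapsing toward zero with depth at initialization, can be seen in a small simulation. The sketch below is our own illustration, not code from the paper: the function names, the He-scaled Gaussian initialization, and the chosen width and depth are assumptions. It passes two inputs through a stack of random ReLU layers and records the angle between their hidden representations, and it also includes the standard infinite-width angle recursion for ReLU layers (the normalized degree-one arc-cosine kernel), which is one example of the kind of infinite-width prediction the abstract mentions comparing against.

```python
# Hypothetical sketch (not the paper's algorithm): observe angle collapse
# between two inputs in a randomly initialized fully connected ReLU network.
import numpy as np

def angle(u, v):
    """Angle in radians between two vectors."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def relu_forward_angles(x1, x2, depth=50, width=512, seed=0):
    """Pass two inputs through `depth` random ReLU layers with He-scaled
    Gaussian weights and record the angle between their representations."""
    rng = np.random.default_rng(seed)
    h1, h2 = x1, x2
    angles = [angle(h1, h2)]
    for _ in range(depth):
        fan_in = h1.shape[0]
        W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(width, fan_in))
        h1, h2 = np.maximum(W @ h1, 0.0), np.maximum(W @ h2, 0.0)
        angles.append(angle(h1, h2))
    return angles

def infinite_width_angle_map(theta):
    """One layer of the infinite-width ReLU angle recursion:
    cos(theta_next) = (sin(theta) + (pi - theta) * cos(theta)) / pi."""
    return np.arccos((np.sin(theta) + (np.pi - theta) * np.cos(theta)) / np.pi)

# Example: two orthogonal inputs. Both the simulated angles and the
# infinite-width recursion shrink toward zero as depth increases.
x1 = np.zeros(512); x1[0] = 1.0
x2 = np.zeros(512); x2[1] = 1.0
print([round(a, 3) for a in relu_forward_angles(x1, x2)[:10]])

theta = np.pi / 2
for _ in range(9):
    theta = infinite_width_angle_map(theta)
print(round(theta, 3))
```

Zero is a fixed point of the layer-to-layer angle map, so both the finite-width simulation and the infinite-width recursion drive angles toward zero with depth; how quickly they do so, and how the two predictions differ at finite width, is the question the paper studies.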


