Dynamically Stable Infinite-Width Limits of Neural Classifiers

06/11/2020
by Eugene A. Golikov

Recent research has focused on two approaches to studying neural network training in the infinite-width limit: (1) the mean-field (MF) approximation and (2) the constant neural tangent kernel (NTK) approximation. The two approaches scale hyperparameters differently with the width of a network layer and therefore yield different infinite-width limit models. We propose a general framework for studying how the limit behavior of a neural model depends on the scaling of its hyperparameters with network width. Our framework recovers the scalings of the existing MF and NTK limits, as well as uncountably many other scalings that lead to dynamically stable limit behavior of the corresponding models. Nevertheless, only a finite number of distinct limit models are induced by these scalings. Each distinct limit model corresponds to a unique combination of properties such as boundedness of logits and tangent kernels at initialization, or stationarity of tangent kernels. The existing MF and NTK limit models, as well as one novel limit model, satisfy most of the properties exhibited by finite-width models. We also propose a novel initialization-corrected mean-field limit that satisfies all of the properties above; the corresponding model is a simple modification of a finite-width model. Source code to reproduce all reported results is available on GitHub.
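To make the role of width-scaling concrete, the sketch below (an illustration under standard assumptions, not the paper's code) contrasts the NTK and mean-field output scalings of a one-hidden-layer network at initialization. The helper name logit_at_init and the specific 1/sqrt(n) vs. 1/n output factors are illustrative choices: under the NTK scaling the logit at initialization stays of order one as the width n grows, while under the MF scaling it shrinks towards zero, which is the kind of "boundedness of logits at initialization" property the abstract refers to.

```python
# Minimal NumPy sketch (illustrative, not the paper's framework):
# compare the scale of a single logit at initialization under
# NTK-style (1/sqrt(n)) and mean-field-style (1/n) output scaling.
import numpy as np

rng = np.random.default_rng(0)

def logit_at_init(n, x, scaling="ntk"):
    """Logit of a one-hidden-layer ReLU net of width n at initialization."""
    w1 = rng.standard_normal((n, x.shape[0]))  # input-to-hidden weights ~ N(0, 1)
    w2 = rng.standard_normal(n)                # hidden-to-output weights ~ N(0, 1)
    h = np.maximum(w1 @ x, 0.0)                # hidden activations
    scale = 1.0 / np.sqrt(n) if scaling == "ntk" else 1.0 / n
    return scale * (w2 @ h)

x = rng.standard_normal(10)
for n in (100, 10_000, 1_000_000):
    print(f"width={n:>9}: ntk={logit_at_init(n, x, 'ntk'):+.4f}  "
          f"mf={logit_at_init(n, x, 'mf'):+.6f}")
```

Running the loop over increasing widths shows the NTK-scaled logit remaining O(1) while the MF-scaled logit vanishes, one instance of how different hyperparameter scalings induce different limit behaviors.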
