Sensitivity and Generalization in Neural Networks: an Empirical Study

02/23/2018
by Roman Novak, et al.

In practice, large over-parameterized neural networks are often found to generalize better than their smaller counterparts, an observation that appears to conflict with classical notions of function complexity, which typically favor smaller models. In this work, we investigate this tension between complexity and generalization through an extensive empirical exploration of two natural metrics of complexity related to sensitivity to input perturbations. Our experiments survey thousands of models with various fully-connected architectures, optimizers, and other hyper-parameters, as well as four different image classification datasets. We find that trained neural networks are more robust to input perturbations in the vicinity of the training data manifold, as measured by the norm of the network's input-output Jacobian, and that this robustness correlates well with generalization. We further establish that factors associated with poor generalization, such as full-batch training or using random labels, correspond to lower robustness, while factors associated with good generalization, such as data augmentation and ReLU non-linearities, give rise to more robust functions. Finally, we demonstrate that the input-output Jacobian norm can be predictive of generalization at the level of individual test points.
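The sensitivity metric highlighted above, the norm of the input-output Jacobian, is straightforward to estimate with modern automatic differentiation. Below is a minimal sketch (not the authors' code) that computes the Frobenius norm of the Jacobian of a small fully-connected ReLU network at a single input point, using JAX; the architecture, initialization, and input dimension are illustrative assumptions.

```python
# Minimal sketch: Frobenius norm of the input-output Jacobian of a small
# fully-connected ReLU network at one input point. Architecture, init, and
# input size are illustrative assumptions, not the paper's exact setup.
import jax
import jax.numpy as jnp

def mlp(params, x):
    """Fully-connected ReLU network mapping an input vector to logits."""
    for w, b in params[:-1]:
        x = jax.nn.relu(x @ w + b)
    w, b = params[-1]
    return x @ w + b

def jacobian_frobenius_norm(params, x):
    """||J(x)||_F, where J is the Jacobian of the outputs w.r.t. the input x."""
    jac = jax.jacrev(lambda inp: mlp(params, inp))(x)  # shape: (n_outputs, n_inputs)
    return jnp.sqrt(jnp.sum(jac ** 2))

# Example: a random 784-256-10 network (MNIST-sized input).
k1, k2, k3 = jax.random.split(jax.random.PRNGKey(0), 3)
params = [
    (jax.random.normal(k1, (784, 256)) / jnp.sqrt(784.0), jnp.zeros(256)),
    (jax.random.normal(k2, (256, 10)) / jnp.sqrt(256.0), jnp.zeros(10)),
]
x = jax.random.normal(k3, (784,))
print(jacobian_frobenius_norm(params, x))  # smaller norm = less sensitive locally
```

Evaluated at points near the data manifold, this quantity is the kind of robustness measure the abstract reports as correlating with generalization, both on average and at individual test points.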

Related research

07/05/2022 · Predicting Out-of-Domain Generalization with Local Manifold Smoothness
    Understanding how machine learning models generalize to new environments...

03/03/2021 · Formalizing Generalization and Robustness of Neural Networks to Weight Perturbations
    Studying the sensitivity of weight perturbation in neural networks and i...

10/18/2019 · Towards Quantifying Intrinsic Generalization of Deep ReLU Networks
    Understanding the underlying mechanisms that enable the empirical succes...

11/11/2019 · An empirical study of the relation between network architecture and complexity
    In this preregistration submission, we propose an empirical study of how...

06/11/2020 · Tangent Space Sensitivity and Distribution of Linear Regions in ReLU Networks
    Recent articles indicate that deep neural networks are efficient models ...

02/13/2019 · Identity Crisis: Memorization and Generalization under Extreme Overparameterization
    We study the interplay between memorization and generalization of overpa...

04/12/2021 · Generalization bounds via distillation
    This paper theoretically investigates the following empirical phenomenon...
