Unveiling the Hessian's Connection to the Decision Boundary

06/12/2023
by Mahalakshmi Sabanayagam, et al.

Understanding the properties of well-generalizing minima is at the heart of deep learning research. On the one hand, the generalization of neural networks has been connected to the complexity of the decision boundary, which is hard to study in the high-dimensional input space. On the other hand, the flatness of a minimum has become a controversial proxy for generalization. In this work, we provide the missing link between the two approaches and show that the top eigenvectors of the Hessian characterize the decision boundary learned by the neural network. Notably, the number of outliers in the Hessian spectrum is proportional to the complexity of the decision boundary. Based on this finding, we provide a new and straightforward approach to studying the complexity of a high-dimensional decision boundary; we show that this connection naturally inspires a new generalization measure; and finally, we develop a novel margin estimation technique which, in combination with the generalization measure, precisely identifies minima with simple wide-margin boundaries. Overall, this analysis establishes the connection between the Hessian and the decision boundary and provides a new method to identify minima with simple wide-margin decision boundaries.
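To make the central quantity concrete, the sketch below computes the eigenvalue spectrum of the loss Hessian for a simple model and counts "outlier" eigenvalues separated from the bulk. This is an illustrative toy, not the paper's method: logistic regression is used so the Hessian has a closed form, the data are random, and the outlier threshold (a multiple of the median eigenvalue) is an arbitrary choice for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification setup (all values illustrative, not from the paper).
X = rng.normal(size=(200, 5))          # 200 inputs, 5 features
w = rng.normal(size=5)                  # weight vector at which we inspect the loss
p = 1.0 / (1.0 + np.exp(-X @ w))        # sigmoid outputs

# For the average logistic (cross-entropy) loss, the Hessian w.r.t. w is
# exactly H = X^T diag(p * (1 - p)) X / n, so no autodiff is needed here.
H = (X * (p * (1 - p))[:, None]).T @ X / len(X)

# H is symmetric, so eigvalsh gives real eigenvalues in ascending order.
eigvals = np.linalg.eigvalsh(H)

# Crude outlier count: eigenvalues well separated from the bulk of the
# spectrum (threshold factor 5 is a hypothetical choice for this demo).
bulk = np.median(eigvals)
n_outliers = int(np.sum(eigvals > 5 * bulk))
print("spectrum:", np.round(eigvals, 3), "outliers:", n_outliers)
```

For a real network one would instead estimate the top Hessian eigenpairs with Hessian-vector products (e.g. power iteration or Lanczos), since the full Hessian is too large to form explicitly; the outlier count then plays the role of the boundary-complexity proxy described in the abstract.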


Related research

Fractal Dimension Generalization Measure (12/22/2020)
Developing a robust generalization measure for the performance of machin...

Analysis of Generalizability of Deep Neural Networks Based on the Complexity of Decision Boundary (09/16/2020)
For supervised learning models, the analysis of generalization ability (...

The Vanishing Decision Boundary Complexity and the Strong First Component (11/25/2022)
We show that unlike machine learning classifiers, there are no complex b...

Input margins can predict generalization too (08/29/2023)
Understanding generalization in deep neural networks is an active area o...

Analytic Study of Families of Spurious Minima in Two-Layer ReLU Neural Networks (07/21/2021)
We study the optimization problem associated with fitting two-layer ReLU...

Visualizing high-dimensional loss landscapes with Hessian directions (08/28/2022)
Analyzing geometric properties of high-dimensional loss functions, such ...

Heating up decision boundaries: isocapacitory saturation, adversarial scenarios and generalization bounds (01/15/2021)
In the present work we study classifiers' decision boundaries via Browni...
