
Applicability of Random Matrix Theory in Deep Learning

by Nicholas P. Baskerville, et al.

We investigate the local spectral statistics of the loss-surface Hessians of artificial neural networks, where we discover excellent agreement with Gaussian Orthogonal Ensemble statistics across several network architectures and datasets. These results shed new light on the applicability of Random Matrix Theory to modelling neural networks and suggest a previously unrecognised role for it in the study of loss surfaces in deep learning. Inspired by these observations, we propose a novel model for the true loss surfaces of neural networks, consistent with our observations, which allows for Hessian spectral densities with rank degeneracy and outliers, both extensively observed in practice, and which predicts a growing independence of loss gradients as a function of distance in weight-space. We further investigate the importance of the true loss surface in neural networks and find, in contrast to previous work, that the exponential hardness of locating the global minimum has practical consequences for achieving state-of-the-art performance.
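The local spectral statistics referred to in the abstract are typically probed through eigenvalue spacing statistics, which for GOE matrices follow a universal distribution. As an illustrative sketch (not the authors' code), the snippet below generates GOE matrices, computes consecutive-spacing ratios r_i = min(s_i, s_{i+1}) / max(s_i, s_{i+1}) (a standard statistic that avoids spectral unfolding), and compares their mean to the known GOE value of roughly 0.5359; the matrix size and sample count are arbitrary choices for the demonstration.

```python
import numpy as np

def goe_matrix(n, rng):
    """Sample an n x n matrix from the Gaussian Orthogonal Ensemble."""
    a = rng.standard_normal((n, n))
    return (a + a.T) / np.sqrt(2 * n)

def spacing_ratios(eigs):
    """Ratios of consecutive eigenvalue spacings, r_i in (0, 1]."""
    s = np.diff(np.sort(eigs))
    return np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])

rng = np.random.default_rng(0)
ratios = np.concatenate(
    [spacing_ratios(np.linalg.eigvalsh(goe_matrix(200, rng))) for _ in range(50)]
)
print(ratios.mean())  # GOE prediction is approximately 0.5359
```

An empirical Hessian whose mean spacing ratio matches this value exhibits GOE-like local statistics; a Poisson (uncorrelated) spectrum would instead give a mean ratio of about 0.386.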




Universal characteristics of deep neural network loss surfaces from random matrix theory

This paper considers several aspects of random matrix universality in de...

Beyond Random Matrix Theory for Deep Networks

We investigate whether the Wigner semi-circle and Marcenko-Pastur distri...

Spectral Ergodicity in Deep Learning Architectures via Surrogate Random Matrices

In this work a novel method to quantify spectral ergodicity for random m...

On the Convex Behavior of Deep Neural Networks in Relation to the Layers' Width

The Hessian of neural networks can be decomposed into a sum of two matri...

Principled Deep Neural Network Training through Linear Programming

Deep Learning has received significant attention due to its impressive p...

Expressive Power and Loss Surfaces of Deep Learning Models

The goals of this paper are two-fold. The first goal is to serve as an e...