
Applicability of Random Matrix Theory in Deep Learning

02/12/2021
by   Nicholas P. Baskerville, et al.

We investigate the local spectral statistics of the loss-surface Hessians of artificial neural networks and discover excellent agreement with Gaussian Orthogonal Ensemble (GOE) statistics across several network architectures and datasets. These results shed new light on the applicability of Random Matrix Theory to modelling neural networks and suggest a previously unrecognised role for it in the study of loss surfaces in deep learning. Inspired by these observations, we propose a novel model for the true loss surfaces of neural networks, consistent with our measurements, that allows for Hessian spectral densities with the rank degeneracy and outliers extensively observed in practice, and that predicts a growing independence of loss gradients as a function of distance in weight space. We further investigate the importance of the true loss surface and find, in contrast to previous work, that the exponential hardness of locating the global minimum has practical consequences for achieving state-of-the-art performance.
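The kind of local spectral statistic the abstract refers to can be illustrated with the adjacent eigenvalue spacing-ratio statistic, a standard Random Matrix Theory diagnostic (this is a hedged sketch, not the authors' code; in the paper the statistics are computed on neural-network Hessians rather than the synthetic GOE matrices used here). For the GOE, the mean spacing ratio is known to be approximately 0.5359, versus approximately 0.386 for uncorrelated (Poisson) eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)

def goe_matrix(n, rng):
    # Symmetrise an i.i.d. Gaussian matrix to obtain a GOE sample.
    a = rng.normal(size=(n, n))
    return (a + a.T) / 2.0

def mean_spacing_ratio(eigs):
    # Adjacent-gap ratio r_i = min(s_i, s_{i+1}) / max(s_i, s_{i+1}),
    # where s_i are consecutive eigenvalue spacings. Its mean separates
    # GOE statistics (~0.536) from Poisson statistics (~0.386) without
    # requiring spectral unfolding.
    s = np.diff(np.sort(eigs))
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()

# Average over independent GOE samples; the result should sit near 0.5359.
ratios = [mean_spacing_ratio(np.linalg.eigvalsh(goe_matrix(200, rng)))
          for _ in range(50)]
print(np.mean(ratios))
```

In the setting of the paper, the eigenvalues fed to such a statistic would come from (approximations of) the Hessian of the training loss, and agreement with the GOE value is the signature of random-matrix universality that the authors report.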

