Heavy-Tailed Regularization of Weight Matrices in Deep Neural Networks

04/06/2023
by Xuanzhe Xiao, et al.

Unraveling the reasons behind the remarkable success and exceptional generalization capabilities of deep neural networks presents a formidable challenge. Recent insights from random matrix theory, specifically those concerning the spectral analysis of weight matrices in deep neural networks, offer valuable clues to address this issue. A key finding indicates that the generalization performance of a neural network is associated with the degree of heavy tails in the spectrum of its weight matrices. To capitalize on this discovery, we introduce a novel regularization technique, termed Heavy-Tailed Regularization, which explicitly promotes a more heavy-tailed spectrum in the weight matrix through regularization. First, we employ the Weighted Alpha and the Stable Rank as penalty terms, both of which are differentiable, enabling direct calculation of their gradients. To circumvent over-regularization, we introduce two variations of the penalty function. Then, adopting a Bayesian perspective and leveraging knowledge from random matrix theory, we develop two novel heavy-tailed regularization methods, using a power-law distribution and a Fréchet distribution as priors for the global spectrum and the maximum eigenvalue, respectively. We empirically show that heavy-tailed regularization outperforms conventional regularization techniques in terms of generalization performance.
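For concreteness, here is a minimal PyTorch sketch (not the authors' released code) of how the two differentiable penalties could be computed for a single weight matrix. The tail size k and the use of a Hill-type estimator for the power-law exponent are assumptions on our part; the paper's exact estimator may differ.

```python
import torch

def stable_rank(W: torch.Tensor) -> torch.Tensor:
    # Stable rank ||W||_F^2 / ||W||_2^2. A heavier-tailed spectrum has a
    # lower stable rank, so adding it to the loss pushes toward heavy tails.
    fro_sq = (W ** 2).sum()
    spec_sq = torch.linalg.matrix_norm(W, ord=2) ** 2  # spectral norm, differentiable
    return fro_sq / spec_sq

def weighted_alpha(W: torch.Tensor, k: int = 10) -> torch.Tensor:
    # Weighted Alpha = alpha_hat * log10(lambda_max), where alpha_hat is a
    # Hill-type estimate of the power-law exponent fitted to the top-k
    # eigenvalues of W^T W (the squared singular values of W). Everything
    # here is differentiable, so gradients flow through the penalty.
    lam = torch.sort(torch.linalg.svdvals(W) ** 2, descending=True).values
    alpha_hat = 1.0 + k / torch.log(lam[:k] / lam[k]).sum()
    return alpha_hat * torch.log10(lam[0])
```

A per-layer term like `loss = task_loss + lam_reg * weighted_alpha(W)` (with `lam_reg` a hypothetical regularization strength, summed over layers) would then be minimized; the two penalty-function variants mentioned in the abstract would presumably clip or rescale these terms to avoid over-regularization.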
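The Bayesian variants can be read as MAP estimation, where the negative log-density of a heavy-tailed prior enters the loss as an additive penalty; the power-law prior on the global spectrum would be handled analogously. Below is a minimal sketch for the Fréchet prior on the maximum eigenvalue, with hypothetical shape alpha and scale s (the paper's parameterization may differ).

```python
import torch

def frechet_neg_log_prior(lam_max: torch.Tensor,
                          alpha: float = 2.0, s: float = 1.0) -> torch.Tensor:
    # Negative log-density, up to an additive constant, of a Frechet(alpha, s)
    # prior evaluated at the largest eigenvalue:
    #   -log f(x) = const + (1 + alpha) * log(x / s) + (x / s)^(-alpha)
    z = lam_max / s
    return (1.0 + alpha) * torch.log(z) + z ** (-alpha)

# Usage sketch: penalize the top eigenvalue of W^T W.
# lam = torch.sort(torch.linalg.svdvals(W) ** 2, descending=True).values
# loss = task_loss + mu * frechet_neg_log_prior(lam[0])  # mu: hypothetical strength
```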

Related research

research · 05/23/2021
Compressing Heavy-Tailed Weight Matrices for Non-Vacuous Generalization Bounds
Heavy-tailed distributions have been studied in statistics, random matri...

research · 11/26/2021
Implicit Data-Driven Regularization in Deep Neural Networks under SGD
Much research effort has been devoted to explaining the success of deep ...

research · 01/24/2019
Traditional and Heavy-Tailed Self Regularization in Neural Network Models
Random Matrix Theory (RMT) is applied to analyze the weight matrices of ...

research · 10/02/2018
Implicit Self-Regularization in Deep Neural Networks: Evidence from Random Matrix Theory and Implications for Learning
Random Matrix Theory (RMT) is applied to analyze weight matrices of Deep...

research · 01/24/2019
Heavy-Tailed Universality Predicts Trends in Test Accuracies for Very Large Pre-Trained Deep Neural Networks
Given two or more Deep Neural Networks (DNNs) with the same or similar a...

research · 03/01/2021
Local Tail Statistics of Heavy-Tailed Random Matrix Ensembles with Unitary Invariance
We study heavy-tailed Hermitian random matrices that are unitarily invar...

research · 04/06/2023
Spectral Gap Regularization of Neural Networks
We introduce Fiedler regularization, a novel approach for regularizing n...
