Capacity Control of ReLU Neural Networks by Basis-path Norm

09/19/2018
by Shuxin Zheng, et al.

Recently, the path norm was proposed as a capacity measure for neural networks with the Rectified Linear Unit (ReLU) activation function; it takes the rescaling-invariance property of ReLU into account. It has been shown that a generalization error bound in terms of the path norm explains the empirical generalization behavior of ReLU neural networks better than bounds based on other capacity measures. Moreover, optimization algorithms that add the path norm as a regularization term to the loss function, such as Path-SGD, have been shown to achieve better generalization performance. However, the path norm counts the values of all paths, so a capacity measure based on it can be improperly influenced by dependencies among different paths. It is also known that every path of a ReLU network can be represented, through multiplication and division operations, by a small group of linearly independent basis paths, which indicates that the generalization behavior of the network depends on only a few basis paths. Motivated by this, we propose a new norm, the Basis-path norm, defined on a group of linearly independent paths, to measure the capacity of neural networks more accurately. We establish a generalization error bound based on this norm and show, via extensive experiments, that it explains the generalization behavior of ReLU networks more accurately than previous capacity measures. In addition, we develop optimization algorithms that minimize the empirical risk regularized by the Basis-path norm. Our experiments on benchmark datasets demonstrate that the proposed regularization method achieves clearly better test performance than previous regularization approaches.
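As a rough, self-contained illustration (not taken from the paper; the helper name path_norm and the p = 2 convention are our assumptions), the following NumPy sketch computes the path norm of a one-hidden-layer ReLU network and checks the rescaling invariance the abstract refers to: scaling a hidden unit's incoming weights by c > 0 and its outgoing weights by 1/c leaves the network function, every path value, and hence the path norm unchanged.

```python
import numpy as np

# Minimal sketch of the (L2) path norm for a one-hidden-layer ReLU network
# f(x) = W2 @ relu(W1 @ x). A path runs input unit -> hidden unit -> output
# unit; its value is the product of the two weights it traverses.
def path_norm(W1, W2, p=2):
    # abs_paths[k, j, i] = |W1[j, i]| * |W2[k, j]|, one entry per path
    abs_paths = np.abs(W1)[None, :, :] * np.abs(W2)[:, :, None]
    return (abs_paths ** p).sum() ** (1.0 / p)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # hidden x input
W2 = rng.normal(size=(2, 4))  # output x hidden
print(path_norm(W1, W2))

# Rescaling invariance: since relu(c * z) = c * relu(z) for c > 0, multiplying
# hidden unit 0's incoming weights by c and dividing its outgoing weights by c
# leaves the network function and every path value unchanged.
c = 10.0
W1s, W2s = W1.copy(), W2.copy()
W1s[0, :] *= c
W2s[:, 0] /= c
print(path_norm(W1s, W2s))  # matches the value above
```

Note that the path values in this example are not all independent: the paper's observation is that they can be generated from a smaller set of linearly independent basis paths, which motivates measuring capacity on that basis set instead.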


Related research

03/06/2019 · Positively Scale-Invariant Flatness of ReLU Neural Networks
It was empirically confirmed by Keskar et al. [SharpMinima] that flatter mi...

05/30/2018 · Towards Understanding the Role of Over-Parametrization in Generalization of Neural Networks
Despite existing work on ensuring generalization of neural networks in t...

10/18/2019 · Interpreting Basis Path Set in Neural Networks
Based on basis path set, G-SGD algorithm significantly outperforms conve...

02/16/2022 · On Measuring Excess Capacity in Neural Networks
We study the excess capacity of deep networks in the context of supervis...

10/22/2019 · Global Capacity Measures for Deep ReLU Networks via Path Sampling
Classical results on the statistical complexity of linear models have co...

08/31/2020 · Extreme Memorization via Scale of Initialization
We construct an experimental setup in which changing the scale of initia...

03/06/2019 · A Priori Estimates of the Population Risk for Residual Networks
Optimal a priori estimates are derived for the population risk of a regu...
