Stochastic Weighted Function Norm Regularization

10/18/2017
by Amal Rannen Triki, et al.

Deep neural networks (DNNs) have become increasingly important due to their excellent empirical performance on a wide range of problems. However, regularization is generally achieved by indirect means, largely because of the complex set of functions defined by a network and the difficulty of measuring function complexity. No method in the literature performs additive regularization based on a norm of the function, as is classically considered in statistical learning theory. In this work, we propose sampling-based approximations to weighted function norms as regularizers for deep neural networks. We provide, to the best of our knowledge, the first proof in the literature of the NP-hardness of computing function norms of DNNs, motivating the necessity of a stochastic optimization strategy. Based on our proposed regularization scheme, stability-based bounds yield an O(N^-1/2) generalization error for the regularizer when it is applied to convex function sets. We establish broad conditions for the convergence of stochastic gradient descent on our objective, including for non-convex function sets such as those defined by DNNs. Finally, we empirically validate the improved performance of the proposed regularization strategy for both convex function sets and DNNs on real-world classification and segmentation tasks.
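To make the core idea concrete, the following is a minimal PyTorch sketch (not the authors' implementation) of a sampling-based estimate of a weighted L2 function norm added to the task loss: inputs are drawn from a weighting distribution mu, the squared network outputs are averaged to estimate ||f||_mu^2, and the estimate is added to the training objective. The helper names `weighted_l2_penalty`, `mu_sampler`, and the coefficient `lam` are illustrative assumptions, not quantities defined in the paper.

```python
import torch
import torch.nn.functional as F

def weighted_l2_penalty(model, sample_inputs):
    """Monte Carlo estimate of a weighted L2 function norm:
    ||f||_mu^2 ~= (1/m) * sum_i ||f(x_i)||^2, with x_i drawn from mu.
    `sample_inputs` is a batch drawn from the weighting distribution mu
    (an assumption for this sketch; the paper's exact estimator may differ)."""
    outputs = model(sample_inputs)
    return outputs.pow(2).sum(dim=1).mean()

def training_step(model, optimizer, x, y, mu_sampler, lam=1e-3):
    """One SGD step on task loss + lam * stochastic function-norm penalty.
    `mu_sampler(m)` is a hypothetical callable returning m inputs sampled from mu."""
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x), y)
    loss = loss + lam * weighted_l2_penalty(model, mu_sampler(x.size(0)))
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the penalty is estimated from a fresh sample at each step, it can be minimized with standard stochastic gradient descent alongside the task loss, which is the setting in which the paper studies convergence.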
