
Exact Spectral Norm Regularization for Neural Networks

by Anton Johansson et al.

We pursue a line of research that regularizes the spectral norm of the Jacobian of the input-output mapping of deep neural networks. While previous work relies on upper-bounding techniques, we provide a scheme that targets the exact spectral norm. We show that our algorithm achieves improved generalization performance compared to previous spectral regularization techniques while simultaneously maintaining a strong safeguard against natural and adversarial noise. Moreover, we revisit earlier reasoning about the strong adversarial protection that Jacobian regularization provides and show that it can be misleading.
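The exact spectral norm of a Jacobian (as opposed to a product of layer-wise bounds) can be estimated without materializing the full matrix, using power iteration driven only by Jacobian-vector and vector-Jacobian products. The sketch below is illustrative, not the paper's implementation; the function names and the toy linear map are assumptions for demonstration:

```python
import numpy as np

def jacobian_spectral_norm(jvp, vjp, dim_in, n_iters=50, seed=0):
    """Estimate the largest singular value of a Jacobian J via power
    iteration on J^T J, given only a Jacobian-vector product
    (jvp: v -> J v) and a vector-Jacobian product (vjp: u -> J^T u).
    For a neural network these are one forward-mode and one
    reverse-mode autodiff pass, respectively."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim_in)
    v /= np.linalg.norm(v)
    for _ in range(n_iters):
        u = jvp(v)                 # J v
        w = vjp(u)                 # J^T J v
        v = w / np.linalg.norm(w)  # renormalize the iterate
    # After convergence, v approximates the top right-singular vector,
    # so ||J v|| approximates the spectral norm.
    return np.linalg.norm(jvp(v))

# Toy check on a linear map, whose Jacobian is the matrix itself:
# singular values are 3 and 1, so the estimate should approach 3.
J = np.array([[3.0, 0.0],
              [0.0, 1.0]])
sigma = jacobian_spectral_norm(lambda v: J @ v, lambda u: J.T @ u, dim_in=2)
```

In a training loop, `sigma` (computed per input batch) would be added to the loss as a penalty term; because each iteration is one forward and one backward pass, the cost scales with the number of power iterations rather than with the Jacobian's size.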




Adversarial Training Generalizes Data-dependent Spectral Norm Regularization

We establish a theoretical link between adversarial training and operato...

Spectral Norm Regularization for Improving the Generalizability of Deep Learning

We investigate the generalizability of deep learning based on the sensit...

Bounding Singular Values of Convolution Layers

In deep neural networks, the spectral norm of the Jacobian of a layer bo...

On Regularization and Robustness of Deep Neural Networks

Despite their success, deep neural networks suffer from several drawback...

Input Hessian Regularization of Neural Networks

Regularizing the input gradient has shown to be effective in promoting t...

Fast Approximate Spectral Normalization for Robust Deep Neural Networks

Deep neural networks (DNNs) play an important role in machine learning d...

Spectral Regularization: an Inductive Bias for Sequence Modeling

Various forms of regularization in learning tasks strive for different n...