Bounding Singular Values of Convolution Layers

11/22/2019
by Sahil Singla, et al.

In deep neural networks, the spectral norm of the Jacobian of a layer bounds the factor by which the norm of a signal changes during forward or backward propagation. Spectral norm regularization has also been shown to improve the generalization and robustness of deep networks. However, existing methods to compute the spectral norm of the Jacobian of convolution layers are either heuristic (efficient to compute, but without guarantees) or exact (but too computationally expensive to use during training). In this work, we resolve these issues by deriving an upper bound on the spectral norm of a standard 2D multi-channel convolution layer. Our method provides a provable bound that is differentiable and can be computed efficiently during training with negligible overhead. We show that our spectral bound is an effective regularizer and can be used to bound the Lipschitz constant and the curvature (eigenvalues of the Hessian) of a neural network. Through experiments on MNIST and CIFAR-10, we demonstrate the effectiveness of our spectral bound in improving the generalization and provable robustness of deep networks against adversarial examples. Our code is available at <https://github.com/singlasahil14/CONV-SV>.
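To make the idea concrete, here is a minimal sketch of a reshaping-based bound of this flavor, not the paper's exact bound but a simpler provable member of the same family: the spectral norm of a circular 2D convolution with kernel `K` of shape `(cout, cin, h, w)` is at most `sqrt(h*w)` times the spectral norm of `K` reshaped to a `(cout, cin*h*w)` matrix. The function names below are illustrative; the exact singular values (via the per-frequency FFT construction of Sedghi et al., 2018) are computed only to sanity-check the bound.

```python
import numpy as np

def exact_conv_singular_values(kernel, n):
    """Exact singular values of a circular 2D convolution on an n x n input,
    via the per-spatial-frequency FFT construction (Sedghi et al., 2018).
    kernel: array of shape (cout, cin, h, w)."""
    cout, cin, h, w = kernel.shape
    padded = np.zeros((cout, cin, n, n))
    padded[:, :, :h, :w] = kernel
    freq = np.fft.fft2(padded)             # FFT over the two spatial axes
    per_freq = freq.transpose(2, 3, 0, 1)  # (n, n, cout, cin) matrices
    return np.linalg.svd(per_freq, compute_uv=False)

def spectral_bound(kernel):
    """Differentiable upper bound: sqrt(h*w) times the spectral norm of the
    kernel reshaped to (cout, cin*h*w). An illustrative simplification; the
    paper derives a tighter bound of this reshaping type."""
    cout, cin, h, w = kernel.shape
    return np.sqrt(h * w) * np.linalg.norm(kernel.reshape(cout, -1), 2)

rng = np.random.default_rng(0)
K = rng.standard_normal((4, 3, 3, 3))      # cout=4, cin=3, 3x3 kernel
exact = exact_conv_singular_values(K, n=8).max()
print(exact <= spectral_bound(K))          # the bound always holds
```

The bound costs one SVD (or a few power-iteration steps) on a small reshaped matrix per training step, versus an `n x n` batch of FFTs and SVDs for the exact computation, which is why bounds of this kind are cheap enough to use as regularizers during training.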


Related research

- Efficiently Computing Local Lipschitz Constants of Neural Networks via Bound Propagation (10/13/2022)
- Large Norms of CNN Layers Do Not Hurt Adversarial Robustness (09/17/2020)
- Efficient Bound of Lipschitz Constant for Convolutional Layers by Gram Iteration (05/25/2023)
- Exact Spectral Norm Regularization for Neural Networks (06/27/2022)
- Spectral Complexity-scaled Generalization Bound of Complex-valued Neural Networks (12/07/2021)
- Flatten the Curve: Efficiently Training Low-Curvature Neural Networks (06/14/2022)
- ReduNet: A White-box Deep Network from the Principle of Maximizing Rate Reduction (05/21/2021)
