Bounding Singular Values of Convolution Layers

11/22/2019
by Sahil Singla, et al.

In deep neural networks, the spectral norm of the Jacobian of a layer bounds the factor by which the norm of a signal changes during forward or backward propagation. Spectral norm regularization has also been shown to improve the generalization and robustness of deep networks. However, existing methods for computing the spectral norm of the Jacobian of convolution layers either rely on heuristics (and are efficient to compute) or are exact (but too computationally expensive to use during training). In this work, we resolve these issues by deriving an upper bound on the spectral norm of a standard 2D multi-channel convolution layer. Our method provides a provable bound that is differentiable and can be computed efficiently during training with negligible overhead. We show that our spectral bound is an effective regularizer and can be used to bound both the Lipschitz constant and the curvature (eigenvalues of the Hessian) of a neural network. Through experiments on MNIST and CIFAR-10, we demonstrate the effectiveness of our spectral bound in improving the generalization and provable robustness of deep networks against adversarial examples. Our code is available at <https://github.com/singlasahil14/CONV-SV>.
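The abstract does not reproduce the paper's bound itself, so as a minimal sketch of the general idea, the snippet below computes a simple, provable, differentiable upper bound on the spectral norm of a convolution's Jacobian: a 2D convolution is a sum over its h·w spatial offsets of a channel-mixing 1×1 map composed with a spatial shift, shifts do not increase the norm, so by the triangle inequality the spectral norm is at most the sum of the spectral norms of the c_out × c_in kernel slices. This is a looser bound than the one derived in the paper; the function name `conv_spectral_bound` and the PyTorch framing are illustrative assumptions, not the authors' implementation (see the linked repo for that).

```python
import torch

def conv_spectral_bound(kernel: torch.Tensor) -> torch.Tensor:
    """Differentiable upper bound on the spectral norm of the Jacobian
    of a 2D convolution with kernel of shape (c_out, c_in, h, w).

    Illustrative triangle-inequality bound (NOT the paper's tighter bound):
        sigma(conv) <= sum over offsets (i, j) of ||K[:, :, i, j]||_2
    """
    c_out, c_in, h, w = kernel.shape
    # Collect the h*w channel-mixing slices: shape (h*w, c_out, c_in).
    slices = kernel.permute(2, 3, 0, 1).reshape(h * w, c_out, c_in)
    # Largest singular value of each slice (batched spectral norm).
    per_offset = torch.linalg.matrix_norm(slices, ord=2)
    return per_offset.sum()

# Usage as a regularizer: penalize the bound alongside the task loss.
conv = torch.nn.Conv2d(3, 16, kernel_size=3)
reg = 0.01 * conv_spectral_bound(conv.weight)  # add `reg` to the loss
```

Because the bound is an ordinary differentiable function of the kernel, gradients flow through it and it can be minimized jointly with the task loss at negligible cost per step, which is the training-time role the abstract describes for the paper's (tighter) bound.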
