The Singular Values of Convolutional Layers

05/26/2018
by Hanie Sedghi, et al.

We characterize the singular values of the linear transformation associated with a convolution applied to a two-dimensional feature map with multiple channels. Our characterization enables efficient computation of the singular values of convolutional layers used in popular deep neural network architectures. It also leads to an algorithm for projecting a convolutional layer onto the set of layers obeying a bound on the operator norm of the layer. We show that this is an effective regularizer; periodically applying these projections during training improves the test error of a residual network on CIFAR-10 from 6.2% to 5.3%.
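
For a concrete picture of the computation the abstract alludes to, here is a minimal NumPy sketch. It assumes a stride-1 convolution with circular (periodic) padding applied to an n x n feature map, and a kernel stored as an array of shape (c_out, c_in, k, k); the function name conv_singular_values and the argument layout are illustrative, not the authors' reference implementation.

    import numpy as np

    def conv_singular_values(kernel, input_shape):
        """Singular values of the linear map defined by a multi-channel 2D
        circular convolution (stride 1, periodic padding).

        kernel: array of shape (c_out, c_in, k, k)
        input_shape: (n, n) spatial size of the feature map, with n >= k
        """
        # 2D FFT of every (output-channel, input-channel) filter,
        # zero-padded to the spatial size of the input.
        transforms = np.fft.fft2(kernel, s=input_shape, axes=(-2, -1))  # (c_out, c_in, n, n)
        # At each frequency the operator reduces to a c_out x c_in matrix;
        # the singular values of these n*n small matrices, taken together,
        # are the singular values of the whole convolutional layer.
        per_frequency = transforms.transpose(2, 3, 0, 1)  # (n, n, c_out, c_in)
        return np.linalg.svd(per_frequency, compute_uv=False)

    # Example: the largest value is the operator norm of the layer.
    kernel = np.random.randn(64, 32, 3, 3) * 0.05
    print(conv_singular_values(kernel, (16, 16)).max())

The projection mentioned in the abstract can be built on the same decomposition, by clipping the per-frequency singular values to the desired bound and mapping the result back with an inverse FFT; the sketch above covers only the forward computation of the spectrum.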

Related research

07/25/2019  A Frobenius norm regularization method for convolutional kernels to avoid unstable gradient problem
    Convolutional neural network is a very important model of deep learning...

03/23/2016  A guide to convolution arithmetic for deep learning
    We introduce a guide to help deep learning practitioners understand and ...

06/12/2020  Asymptotic Singular Value Distribution of Linear Convolutional Layers
    In convolutional neural networks, the linear transformation of multi-cha...

11/24/2022  Towards Practical Control of Singular Values of Convolutional Layers
    In general, convolutional neural networks (CNNs) are easy to train, but ...

03/17/2020  Hyperplane Arrangements of Trained ConvNets Are Biased
    We investigate the geometric properties of the functions learned by trai...

06/10/2019  Network Implosion: Effective Model Compression for ResNets via Static Layer Pruning and Retraining
    Residual Networks with convolutional layers are widely used in the field...

11/05/2017  The Local Dimension of Deep Manifold
    Based on our observation that there exists a dramatic drop for the singu...
