Scaling-up Diverse Orthogonal Convolutional Networks with a Paraunitary Framework

06/16/2021
by   Jiahao Su, et al.

Enforcing orthogonality in neural networks is an antidote to vanishing/exploding gradients, sensitivity to adversarial perturbations, and unbounded generalization errors. However, many previous approaches are heuristic, and the orthogonality of convolutional layers has not been systematically studied: some designs are not exactly orthogonal, while others consider only standard convolutional layers and propose specific classes of realizations. To address this problem, we propose a theoretical framework for orthogonal convolutional layers, which establishes the equivalence between orthogonal convolutional layers in the spatial domain and paraunitary systems in the spectral domain. Since paraunitary systems admit a complete spectral factorization, any orthogonal convolutional layer can be parameterized as a cascade of convolutions with spatial filters. Our framework endows various convolutional layers with high expressive power while maintaining exact orthogonality. Furthermore, our layers are memory- and computation-efficient for deep networks compared to previous designs. Our versatile framework enables, for the first time, the study of architecture designs for deep orthogonal networks, such as choices of skip connections, initialization, stride, and dilation. Consequently, we scale orthogonal networks up to deep architectures, including ResNet, WideResNet, and ShuffleNet, substantially outperforming traditional shallow orthogonal networks.
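The paraunitary idea in the abstract can be sketched numerically. The following NumPy snippet (an illustrative sketch, not the paper's implementation; all names and dimensions are assumptions) builds a 1D multichannel filter as a cascade of classical degree-1 paraunitary blocks ((I - P) + P z^{-1}, with P a rank-1 orthogonal projection) times a constant orthogonal matrix, then checks that the resulting circular convolution preserves the L2 norm of its input, i.e. that the layer acts as an orthogonal linear map:

```python
import numpy as np

rng = np.random.default_rng(0)
C, N, T = 2, 3, 16  # channels, number of degree-1 blocks, signal length

def random_orthogonal(c):
    # Orthogonal matrix from the QR decomposition of a Gaussian matrix.
    q, _ = np.linalg.qr(rng.standard_normal((c, c)))
    return q

# Taps of H(z) = Q * prod_i ((I - P_i) + P_i z^{-1}), P_i rank-1 projections.
taps = [random_orthogonal(C)]  # constant orthogonal factor Q
for _ in range(N):
    v = rng.standard_normal(C)
    v /= np.linalg.norm(v)
    P = np.outer(v, v)  # orthogonal projection onto span{v}
    # Polynomial multiplication of matrix coefficients by ((I - P) + P z^{-1}).
    new = [t @ (np.eye(C) - P) for t in taps] + [np.zeros((C, C))]
    for k, t in enumerate(taps):
        new[k + 1] += t @ P
    taps = new
H = np.stack(taps)  # shape (N + 1, C, C): an FIR paraunitary filter

# Circular multichannel convolution: y[t] = sum_k H[k] @ x[(t - k) mod T]
x = rng.standard_normal((T, C))
y = np.zeros_like(x)
for k in range(len(H)):
    y += np.roll(x, k, axis=0) @ H[k].T

# Paraunitarity of H(z) implies the circular convolution is norm-preserving.
assert np.allclose(np.linalg.norm(y), np.linalg.norm(x))
```

The norm check succeeds because each degree-1 block satisfies ((I - P) + P z)((I - P) + P z^{-1}) = I, so the cascade is paraunitary on the unit circle and, by Parseval, the circular convolution is an isometry.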


