Convolution Aware Initialization

by Armen Aghajanyan, et al.

Initialization of parameters in deep neural networks has been shown to have a significant impact on network performance (Mishkin & Matas, 2015). The initialization scheme devised by He et al. allowed convolution activations to carry a constrained mean, which in turn allowed deep networks to be trained effectively (He et al., 2015a). Orthogonal initializations, and more generally orthogonal matrices in standard recurrent networks, have been shown to mitigate the vanishing and exploding gradient problems (Pascanu et al., 2012). The majority of current initialization schemes do not fully take into account the intrinsic structure of the convolution operator. Using the duality between the Fourier transform and the convolution operator, Convolution Aware Initialization builds orthogonal filters in the Fourier space and, via the inverse Fourier transform, represents them in the standard space. With Convolution Aware Initialization we observe not only higher accuracy and lower loss, but also faster convergence. We achieve a new state of the art on the CIFAR-10 dataset, and come close to the state of the art on various other tasks.
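The idea described above can be sketched in a few lines of NumPy: draw a random basis, orthogonalize it with a QR decomposition to obtain orthogonal filter spectra, then map each spectrum back to a spatial kernel with the inverse real FFT. This is a minimal illustration under stated assumptions (the exact orthogonalization, scaling, and handling of channel counts larger than the number of Fourier coefficients follow the paper and may differ here; the He-style variance rescaling at the end is an assumption, not the paper's exact formula).

```python
import numpy as np

def conv_aware_init(shape, rng=None):
    """Sketch of a convolution-aware initializer.

    shape: (out_channels, in_channels, kh, kw).
    Builds orthogonal filter spectra in the Fourier domain, then maps
    them to spatial kernels with the inverse real 2-D FFT.
    Assumes in_channels <= number of rfft2 coefficients of a kh x kw filter.
    """
    rng = np.random.default_rng(rng)
    out_c, in_c, kh, kw = shape
    # Shape of the half-spectrum produced by rfft2 on a kh x kw kernel.
    fft_shape = (kh, kw // 2 + 1)
    fft_size = fft_shape[0] * fft_shape[1]
    filters = np.empty(shape)
    for i in range(out_c):
        # Random matrix -> orthonormal rows via QR decomposition.
        a = rng.standard_normal((fft_size, in_c))
        q, _ = np.linalg.qr(a)        # columns of q are orthonormal
        basis = q.T                   # in_c orthonormal row vectors
        for j in range(in_c):
            spectrum = basis[j].reshape(fft_shape)
            # Inverse real FFT: represent the filter in the standard space.
            filters[i, j] = np.fft.irfft2(spectrum, s=(kh, kw))
    # Rescale toward a He-style target variance (assumption for this sketch).
    fan_in = in_c * kh * kw
    filters *= np.sqrt(2.0 / fan_in) / (filters.std() + 1e-8)
    return filters
```

In practice such an initializer would be plugged into a framework's weight-initialization hook; the loop form here is kept deliberately explicit to mirror the per-filter construction the abstract describes.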



