Convolution Aware Initialization

02/21/2017
by Armen Aghajanyan, et al.

Initialization of parameters in deep neural networks has been shown to have a large impact on network performance (Mishkin & Matas, 2015). The initialization scheme devised by He et al. allowed convolution activations to carry a constrained mean, which enabled deep networks to be trained effectively (He et al., 2015a). Orthogonal initializations, and more generally orthogonal matrices in standard recurrent networks, have been shown to mitigate the vanishing and exploding gradient problem (Pascanu et al., 2012). The majority of current initialization schemes do not fully take into account the intrinsic structure of the convolution operator. Exploiting the duality between the Fourier transform and the convolution operator, Convolution Aware Initialization builds orthogonal filters in Fourier space and maps them back to the standard space with the inverse Fourier transform. With Convolution Aware Initialization we observed not only higher accuracy and lower loss, but also faster convergence. We achieve a new state of the art on the CIFAR10 dataset and results close to the state of the art on various other tasks.
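The following is a minimal, hedged sketch of the idea described in the abstract, not the authors' exact procedure: draw random complex spectra, orthonormalize them (here in blocks, via QR) so the filters are orthogonal in Fourier space, map them back with the inverse real FFT, and rescale to a He-style variance. The function name, block-wise orthogonalization, and scaling details are illustrative assumptions.

```python
import numpy as np

def _orthogonal_rows(n_rows, n_cols, rng):
    """Complex rows orthonormalized in blocks of at most n_cols
    (full row orthogonality is impossible when n_rows > n_cols)."""
    rows = []
    while len(rows) < n_rows:
        k = min(n_cols, n_rows - len(rows))
        x = rng.standard_normal((n_cols, k)) + 1j * rng.standard_normal((n_cols, k))
        q, _ = np.linalg.qr(x)   # (n_cols, k) with orthonormal columns
        rows.extend(q.T)         # k orthonormal rows of length n_cols
    return np.array(rows[:n_rows])

def conv_aware_init(out_channels, in_channels, kh, kw, seed=0):
    """Hypothetical sketch: orthogonal filters in Fourier space, mapped
    back to the spatial domain and rescaled to a He-style variance."""
    rng = np.random.default_rng(seed)
    fh, fw = kh, kw // 2 + 1                       # rfft2 spectrum size for a kh x kw kernel
    spectra = _orthogonal_rows(out_channels * in_channels, fh * fw, rng)
    spectra = spectra.reshape(out_channels, in_channels, fh, fw)
    kernels = np.fft.irfft2(spectra, s=(kh, kw))   # inverse Fourier transform to standard space
    fan_in = in_channels * kh * kw                 # He et al. (2015) fan-in scaling (assumption)
    kernels *= np.sqrt(2.0 / fan_in) / (kernels.std() + 1e-8)
    return kernels

# Example: a filter bank for a 64-filter, 3-input-channel, 3x3 convolution.
weights = conv_aware_init(64, 3, 3, 3)
print(weights.shape)  # (64, 3, 3, 3)
```

The block-wise QR is one simple way to keep filters mutually orthogonal in Fourier space when there are more filters than spectral coefficients; the paper itself should be consulted for the exact construction and variance correction.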


