Convolutional Normalization: Improving Deep Convolutional Network Robustness and Training

03/01/2021
by Sheng Liu, et al.

Normalization techniques have become a basic component in modern convolutional neural networks (ConvNets). In particular, many recent works demonstrate that promoting the orthogonality of the weights helps train deep models and improve robustness. For ConvNets, most existing methods are based on penalizing or normalizing weight matrices derived from concatenating or flattening the convolutional kernels. These methods often destroy or ignore the benign convolutional structure of the kernels; therefore, they are often expensive or impractical for deep ConvNets. In contrast, we introduce a simple and efficient “convolutional normalization” method that can fully exploit the convolutional structure in the Fourier domain and serve as a simple plug-and-play module to be conveniently incorporated into any ConvNets. Our method is inspired by recent work on preconditioning methods for convolutional sparse coding and can effectively promote each layer's channel-wise isometry. Furthermore, we show that convolutional normalization can reduce the layerwise spectral norm of the weight matrices and hence improve the Lipschitzness of the network, leading to easier training and improved robustness for deep ConvNets. Applied to classification under noise corruptions and generative adversarial network (GAN), we show that convolutional normalization improves the robustness of common ConvNets such as ResNet and the performance of GAN. We verify our findings via extensive numerical experiments on CIFAR-10, CIFAR-100, and ImageNet.
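To make the Fourier-domain idea concrete, here is a minimal illustrative sketch (not the authors' exact method) of normalizing a single 2D kernel so that circular convolution with it becomes an isometry: the kernel's spectrum is rescaled to unit magnitude at every frequency, so convolution preserves the input's l2 norm. The function names and the small epsilon guard are my own assumptions for illustration.

```python
import numpy as np

def fourier_normalize_kernel(kernel, input_shape):
    """Illustrative sketch: rescale the kernel's spectrum to unit modulus
    so that circular convolution with the result is an isometry on inputs
    of shape `input_shape`. Not the paper's exact procedure."""
    H, W = input_shape
    K = np.fft.fft2(kernel, s=(H, W))          # zero-pad and take spectrum
    eps = 1e-12                                # guard against zero frequency bins
    K_unit = K / np.maximum(np.abs(K), eps)    # unit-magnitude spectrum
    # Conjugate symmetry of K carries over to K_unit, so the inverse FFT is real.
    return np.real(np.fft.ifft2(K_unit))

def circ_conv2d(x, kernel):
    """Circular 2D convolution via the FFT (output has the shape of x)."""
    return np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(kernel, s=x.shape)))

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
k = rng.standard_normal((3, 3))
k_n = fourier_normalize_kernel(k, x.shape)
y = circ_conv2d(x, k_n)
# By Parseval's theorem, a unit-modulus spectrum preserves the l2 norm:
print(np.allclose(np.linalg.norm(y), np.linalg.norm(x)))  # True
```

Because every singular value of the normalized (circular) convolution operator equals one, its spectral norm is exactly one, which is the sense in which this kind of normalization controls layerwise Lipschitz constants.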

Related research:

- 02/27/2019 · Equi-normalization of Neural Networks
  Modern neural networks are over-parametrized. In particular, each rectif...
- 09/11/2018 · Normalization in Training Deep Convolutional Neural Networks for 2D Bio-medical Semantic Segmentation
  2D bio-medical semantic segmentation is important for surgical robotic v...
- 10/23/2022 · Pushing the Efficiency Limit Using Structured Sparse Convolutions
  Weight pruning is among the most popular approaches for compressing deep...
- 11/19/2018 · Generalizable Adversarial Training via Spectral Normalization
  Deep neural networks (DNNs) have set benchmarks on a wide array of super...
- 11/12/2022 · ABCAS: Adaptive Bound Control of spectral norm as Automatic Stabilizer
  Spectral Normalization is one of the best methods for stabilizing the tr...
- 10/06/2017 · Projection Based Weight Normalization for Deep Neural Networks
  Optimizing deep neural networks (DNNs) often suffers from the ill-condit...
- 05/28/2019 · Network Deconvolution
  Convolution is a central operation in Convolutional Neural Networks (CNN...
