Projection Based Weight Normalization for Deep Neural Networks

10/06/2017
by Lei Huang et al.

Optimizing deep neural networks (DNNs) often suffers from ill-conditioning. We observe that the scaling-based weight-space symmetry in rectified nonlinear networks contributes to this negative effect. We therefore propose constraining the incoming weights of each neuron to unit norm, which we formulate as an optimization problem over the Oblique manifold. We develop a simple yet efficient method, referred to as projection based weight normalization (PBWN), to solve this problem: PBWN performs a standard gradient update, then projects the updated weights back onto the Oblique manifold. The proposed method has a regularization effect and collaborates well with the commonly used batch normalization technique. We conduct comprehensive experiments on several widely used image datasets, including CIFAR-10, CIFAR-100, SVHN and ImageNet, for supervised learning with state-of-the-art convolutional neural networks such as Inception, VGG and residual networks. The results show that our method consistently improves the performance of DNNs across different architectures. We also apply our method to the Ladder network for semi-supervised learning on the permutation-invariant MNIST dataset, where it outperforms the state-of-the-art methods, achieving a test error of 2.52%.
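The update rule described in the abstract — a standard gradient step followed by a projection of each neuron's incoming weight vector back to unit norm — can be sketched in a few lines. This is a minimal illustration of the idea, not the authors' implementation; the function name `pbwn_step` and the convention that rows of `W` hold incoming weights are assumptions for the example.

```python
import numpy as np

def pbwn_step(W, grad, lr=0.1, eps=1e-8):
    """One PBWN-style update (illustrative sketch, not the paper's code):
    apply a standard gradient step, then project the updated weights
    back onto the Oblique manifold by renormalizing each neuron's
    incoming weight vector (a row of W) to unit L2 norm."""
    W = W - lr * grad                                   # standard SGD update
    norms = np.linalg.norm(W, axis=1, keepdims=True)    # per-row L2 norms
    return W / (norms + eps)                            # project rows to unit norm

# Toy usage: after the step, every row has (approximately) unit norm.
rng = np.random.default_rng(0)
W = pbwn_step(rng.standard_normal((4, 3)), rng.standard_normal((4, 3)))
print(np.linalg.norm(W, axis=1))
```

Because the projection is a cheap row-wise renormalization, it composes with any optimizer and, as the abstract notes, with batch normalization.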


