Improving training of deep neural networks via Singular Value Bounding

11/18/2016
by   Kui Jia, et al.

Deep learning methods have recently achieved great success on many computer vision problems, with image classification and object detection as prominent examples. In spite of these practical successes, the optimization of deep networks remains an active topic in deep learning research. In this work, we investigate properties of network solutions that can potentially lead to good performance. Our research is inspired by theoretical and empirical results that use orthogonal matrices to initialize networks, but we are interested in how orthogonal weight matrices perform when network training converges. To this end, we propose to constrain the solutions of weight matrices to the orthogonal feasible set during the whole process of network training, and we achieve this with a simple yet effective method called Singular Value Bounding (SVB). In SVB, all singular values of each weight matrix are simply bounded in a narrow band around the value of 1. Based on the same motivation, we also propose Bounded Batch Normalization (BBN), which improves Batch Normalization by removing its potential risk of an ill-conditioned layer transform. We present both theoretical and empirical results to justify the proposed methods. Experiments on benchmark image classification datasets show the efficacy of SVB and BBN. In particular, we achieve a state-of-the-art error rate of 3.06% using off-the-shelf network architectures (Wide ResNets). Our preliminary results on ImageNet also show promise for large-scale learning.
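The core SVB operation described above can be sketched as a simple projection step: decompose each weight matrix with an SVD, clip every singular value into a narrow band around 1, and reconstruct. The sketch below is a minimal NumPy illustration under assumed choices (the band `[1/(1+eps), 1+eps]`, the value `eps=0.05`, and how often the projection is applied are illustrative, not the paper's exact settings):

```python
import numpy as np

def singular_value_bounding(W, eps=0.05):
    """Project a weight matrix toward the orthogonal feasible set by
    clipping all of its singular values into the band [1/(1+eps), 1+eps].
    The band parameterization here is an illustrative assumption."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s_clipped = np.clip(s, 1.0 / (1.0 + eps), 1.0 + eps)
    # Reconstruct W with the bounded spectrum: U @ diag(s_clipped) @ Vt.
    return (U * s_clipped) @ Vt

# Usage: in training, this projection would be applied to each layer's
# weight matrix every few SGD iterations.
rng = np.random.default_rng(0)
W = rng.normal(size=(256, 128))          # a stand-in dense layer weight
W_svb = singular_value_bounding(W, eps=0.05)
s = np.linalg.svd(W_svb, compute_uv=False)
print(s.min(), s.max())  # every singular value now lies inside the band
```

Because all singular values of the projected matrix are close to 1, the layer transform is well-conditioned: it neither strongly amplifies nor strongly attenuates signals (or gradients) passing through it, which is the property the method aims to preserve throughout training.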


