
Gradient Normalization & Depth Based Decay For Deep Learning

12/10/2017
by Robert Kwiatkowski, et al.

In this paper we introduce a novel method of gradient normalization and decay with respect to depth. Our method leverages the simple concept of normalizing all gradients in a deep neural network and then decaying those gradients with respect to their depth in the network. The proposed normalization and decay techniques can be used in conjunction with most current state-of-the-art optimizers and are a very simple addition to any network. Although simple, this method showed improvements in convergence time on state-of-the-art networks such as DenseNet and ResNet on image classification tasks, as well as on an LSTM for natural language processing tasks.
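
The abstract does not give the exact normalization or decay schedule, but the core idea of normalizing each layer's gradient and then scaling it by a depth-dependent factor can be sketched as below. This is a minimal illustration, not the paper's implementation: the per-parameter L2 normalization, the geometric decay factor, and the helper name normalize_and_decay_gradients are all assumptions.

```python
import torch

def normalize_and_decay_gradients(layers, decay=0.9):
    """Normalize each layer's gradients, then decay them by depth.

    Sketch only: unit L2 normalization per parameter and a geometric
    decay in the layer index are assumed, since the abstract does not
    specify the exact scheme.
    """
    for depth, layer in enumerate(layers):   # depth 0 = layer nearest the input (assumed convention)
        scale = decay ** depth                # hypothetical depth-based decay factor
        for p in layer.parameters():
            if p.grad is None:
                continue
            norm = p.grad.norm()
            if norm > 0:
                p.grad.mul_(scale / norm)     # normalize to unit norm, then decay

# Hypothetical usage: apply between backward() and the optimizer step,
# so it composes with any standard optimizer.
model = torch.nn.Sequential(torch.nn.Linear(10, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 10), torch.randn(8, 1)
loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward()
normalize_and_decay_gradients(list(model), decay=0.9)
optimizer.step()
```

Because the rescaling is applied directly to the stored gradients, it slots in ahead of whichever optimizer the network already uses, which matches the abstract's claim that the technique works alongside most existing optimizers.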


06/05/2018

Stochastic Gradient Descent with Hyperbolic-Tangent Decay

Learning rate scheduler has been a critical issue in the deep neural net...
07/22/2019

Channel Normalization in Convolutional Neural Network avoids Vanishing Gradients

Normalization layers are widely used in deep neural networks to stabiliz...
07/13/2017

Be Careful What You Backpropagate: A Case For Linear Output Activations & Gradient Boosting

In this work, we show that saturating output activation functions, such ...
12/15/2017

Gradients explode - Deep Networks are shallow - ResNet explained

Whereas it is believed that techniques such as Adam, batch normalization...
10/18/2022

Hierarchical Normalization for Robust Monocular Depth Estimation

In this paper, we address monocular depth estimation with deep neural ne...
06/07/2021

SPANet: Generalized Permutationless Set Assignment for Particle Physics using Symmetry Preserving Attention

The creation of unstable heavy particles at the Large Hadron Collider is...
05/30/2015

Efficient combination of pairwise feature networks

This paper presents a novel method for the reconstruction of a neural ne...