Backward Gradient Normalization in Deep Neural Networks

06/17/2021
by Alejandro Cabana, et al.

We introduce a new technique for gradient normalization during neural network training. Gradients are rescaled during the backward pass by normalization layers inserted at certain points in the network architecture. These normalization nodes do not affect forward activation propagation, but they modify the backpropagation equations to allow a well-scaled gradient flow that reaches the deepest network layers without vanishing or exploding. Experiments with very deep neural networks show that the new technique effectively controls the gradient norm, enables weight updates in the deepest layers, and improves network accuracy under several experimental conditions.
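The abstract describes nodes that act as the identity in the forward pass but rescale the gradient in the backward pass. A minimal sketch of that idea, assuming a toy two-layer linear network with a manual backward pass (the function names `gradnorm_forward` and `gradnorm_backward` and the target-norm parameter are illustrative, not the authors' implementation):

```python
import numpy as np

def gradnorm_forward(x):
    # identity: forward activations pass through unchanged
    return x

def gradnorm_backward(grad, target_norm=1.0, eps=1e-12):
    # rescale the incoming gradient so its L2 norm equals target_norm
    norm = np.linalg.norm(grad)
    return grad * (target_norm / max(norm, eps))

# toy network y = W2 @ gradnorm(W1 @ x), with tiny weights so that
# gradients vanish by the time they reach the deepest layer W1
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3)) * 1e-3
W2 = rng.standard_normal((2, 4)) * 1e-3
x = rng.standard_normal(3)

# forward pass: the normalization node does not change activations
h = gradnorm_forward(W1 @ x)
y = W2 @ h

# backward pass for the loss L = 0.5 * ||y||^2, so dL/dy = y
g_y = y
g_h = W2.T @ g_y                 # tiny (~1e-9) without normalization
g_h = gradnorm_backward(g_h)     # rescaled to unit norm at the node
g_W1 = np.outer(g_h, x)          # well-scaled update for the deepest layer
```

In a real framework this pattern is typically expressed as a custom autograd operation whose forward is the identity and whose backward applies the rescaling; the sketch above just makes the two passes explicit.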


Related research

12/02/2018 · Analysis on Gradient Propagation in Batch Normalized Residual Networks
07/22/2019 · Channel Normalization in Convolutional Neural Network avoids Vanishing Gradients
11/09/2022 · Designing Network Design Strategies Through Gradient Path Analysis
02/24/2020 · Breaking Batch Normalization for better explainability of Deep Neural Networks through Layer-wise Relevance Propagation
02/07/2023 · On the Ideal Number of Groups for Isometric Gradient Propagation
11/21/2019 · Volume-preserving Neural Networks: A Solution to the Vanishing Gradient Problem
11/24/2021 · Softmax Gradient Tampering: Decoupling the Backward Pass for Improved Fitting
