On Batch Orthogonalization Layers

12/07/2018
by Blanchette et al.

Batch normalization has become ubiquitous in many state-of-the-art networks. It accelerates training and yields good performance results. However, alternatives to normalization exist, e.g., orthonormalization. The objective of this paper is to explore possible alternatives to channel normalization based on orthonormalization layers. The performance of these algorithms is compared with that of BN using prescribed performance measures.
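To illustrate the distinction between channel normalization and orthonormalization, the following minimal sketch contrasts standard batch normalization (per-channel standardization) with a batch whitening step that decorrelates channels via ZCA whitening. This is not the paper's implementation; the function names and the eps value are illustrative assumptions.

# Sketch: batch normalization vs. batch orthonormalization (ZCA whitening)
# on a (N, C) activation matrix. Assumed names and constants, for illustration.
import numpy as np

def batch_norm(x, eps=1e-5):
    # Standardize each channel independently: zero mean, unit variance.
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def batch_orthonorm(x, eps=1e-5):
    # Decorrelate channels via ZCA whitening: after the transform the
    # channel covariance is approximately the identity matrix.
    xc = x - x.mean(axis=0, keepdims=True)
    cov = xc.T @ xc / x.shape[0]                      # (C, C) channel covariance
    eigvals, eigvecs = np.linalg.eigh(cov)            # symmetric eigendecomposition
    whitener = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T
    return xc @ whitener

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(256, 8)) @ rng.normal(size=(8, 8))  # correlated channels
    print(np.round(np.cov(batch_norm(x), rowvar=False), 2))       # unit diagonal, off-diagonals remain
    print(np.round(np.cov(batch_orthonorm(x), rowvar=False), 2))  # ~identity covariance

Batch normalization only rescales each channel, so correlations between channels remain; the whitening step removes them as well, which is the kind of orthonormalization layer the paper compares against BN.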
