An Adaptive Batch Normalization in Deep Learning

11/03/2022
by Wael Alsobhi, et al.

Batch Normalization (BN) is a widely used technique for accelerating and stabilizing the training of deep convolutional neural networks. However, BN is applied uniformly throughout the network, even though some training data may not require it. In this work, we propose a threshold-based adaptive BN approach that separates the data that requires BN from the data that does not. Experimental evaluation on MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100 demonstrates that the proposed approach outperforms traditional BN, particularly at small batch sizes. It also reduces the occurrence of internal variable transformation, increasing network stability.
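To make the idea concrete, below is a minimal sketch of how such a threshold-gated BN layer might look in PyTorch. The gating criterion (mean absolute drift of the current batch statistics from the layer's running statistics) and the `threshold` parameter are illustrative assumptions, not the authors' exact separation rule.

```python
import torch
import torch.nn as nn


class ThresholdAdaptiveBN(nn.Module):
    """Sketch of a threshold-based adaptive BN layer (gating rule assumed)."""

    def __init__(self, num_features, threshold=0.1, momentum=0.1, eps=1e-5):
        super().__init__()
        self.threshold = threshold
        self.bn = nn.BatchNorm2d(num_features, eps=eps, momentum=momentum)

    def forward(self, x):
        if self.training:
            # Hypothetical gate: measure how far the current batch statistics
            # deviate from the running statistics tracked by the BN layer.
            batch_mean = x.mean(dim=(0, 2, 3))
            batch_var = x.var(dim=(0, 2, 3), unbiased=False)
            drift = ((batch_mean - self.bn.running_mean).abs().mean()
                     + (batch_var - self.bn.running_var).abs().mean())
            if drift < self.threshold:
                # Statistics already look stable for this batch: skip
                # normalization and leave the running statistics untouched.
                return x
        # Otherwise (or at inference) apply standard batch normalization.
        return self.bn(x)


# Usage example on a random mini-batch:
layer = ThresholdAdaptiveBN(num_features=16, threshold=0.1)
x = torch.randn(8, 16, 32, 32)
y = layer(x)
```

In this sketch, inference always falls back to standard BN with running statistics; only training batches whose statistics have not drifted are passed through unnormalized.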

Related research

12/06/2017 · AdaBatch: Adaptive Batch Sizes for Training Deep Neural Networks
Training deep neural networks with Stochastic Gradient Descent, or its v...

04/23/2018 · Decorrelated Batch Normalization
Batch Normalization (BN) is capable of accelerating the training of deep...

05/28/2019 · Network Deconvolution
Convolution is a central operation in Convolutional Neural Networks (CNN...

02/13/2018 · Uncertainty Estimation via Stochastic Batch Normalization
In this work, we investigate Batch Normalization technique and propose i...

10/27/2019 · Inherent Weight Normalization in Stochastic Neural Networks
Multiplicative stochasticity such as Dropout improves the robustness and...

06/08/2020 · Passive Batch Injection Training Technique: Boosting Network Performance by Injecting Mini-Batches from a different Data Distribution
This work presents a novel training technique for deep neural networks t...

12/15/2020 · SPOC learner's final grade prediction based on a novel sampling batch normalization embedded neural network method
Recent years have witnessed the rapid growth of Small Private Online Cou...
