What is Batch Normalization?
Batch Normalization is a technique used when training neural networks that converts the interlayer outputs of the network into a standardized format, a process called normalization. This effectively 'resets' the distribution of the previous layer's output, allowing it to be processed more efficiently by the subsequent layer.
What are the Advantages of Batch Normalization?
Batch normalization offers several well-documented benefits: it stabilizes and speeds up training, permits higher learning rates, makes the network less sensitive to how its weights are initialized, and adds a mild regularizing effect that can reduce the need for techniques such as dropout.
How Does Batch Normalization Work?
To enhance the stability of a deep learning network, batch normalization normalizes the output of the previous activation layer by subtracting the batch mean and then dividing by the batch's standard deviation.
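Written out, for a mini-batch B = {x_1, ..., x_m}, this is the transform from the original batch-normalization paper (Ioffe & Szegedy, 2015), where ε is a small constant added for numerical stability:

```latex
\mu_B = \frac{1}{m}\sum_{i=1}^{m} x_i, \qquad
\sigma_B^2 = \frac{1}{m}\sum_{i=1}^{m}\left(x_i - \mu_B\right)^2, \qquad
\hat{x}_i = \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}}
```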
However, forcing every layer's output to have zero mean and unit variance can limit what the layer is able to represent, so batch normalization also applies a trainable scale (gamma) and shift (beta) to the normalized output. These two parameters are learned by stochastic gradient descent along with the rest of the network's weights, which lets the network undo the normalization wherever doing so reduces the loss.
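A minimal sketch of the full forward pass in NumPy (the function name and shapes here are illustrative, not taken from any particular framework):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch-normalize a mini-batch of activations.

    x:     activations of shape (batch_size, num_features)
    gamma: per-feature scale, shape (num_features,)  -- trainable
    beta:  per-feature shift, shape (num_features,)  -- trainable
    """
    # 1. Per-feature statistics computed over the batch dimension.
    batch_mean = x.mean(axis=0)
    batch_var = x.var(axis=0)

    # 2. Normalize to zero mean, unit variance (eps avoids division by zero).
    x_hat = (x - batch_mean) / np.sqrt(batch_var + eps)

    # 3. Scale and shift by the learned parameters, so gradient descent
    #    can recover the original distribution if that lowers the loss.
    return gamma * x_hat + beta

# Example: normalize the output of a hypothetical hidden layer.
activations = np.random.randn(32, 4) * 3.0 + 10.0  # mean ~10, std ~3
gamma = np.ones(4)   # initialized to an identity scale
beta = np.zeros(4)   # initialized to a zero shift
out = batch_norm_forward(activations, gamma, beta)
print(out.mean(axis=0))  # ~0 per feature
print(out.std(axis=0))   # ~1 per feature
```

In a real framework, gamma and beta would be updated by the optimizer during training, and running averages of the batch statistics would be maintained for use at inference time, when no mini-batch is available.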