Batch Normalization is a Cause of Adversarial Vulnerability

05/06/2019
by Angus Galloway et al.

Batch normalization (batch norm) is often used in an attempt to stabilize and accelerate training in deep neural networks. In many cases it indeed decreases the number of parameter updates required to reduce the training error. However, it also reduces robustness to small input perturbations and noise by double-digit percentages, as we show on five standard datasets. Furthermore, substituting weight decay for batch norm is sufficient to nullify the relationship between adversarial vulnerability and the input dimension. Our work is consistent with a mean-field analysis that found that batch norm causes exploding gradients.
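The batch norm transform at the center of the paper's analysis normalizes each feature over the batch dimension before applying a learned scale and shift. A minimal NumPy sketch (the function name, toy inputs, and epsilon value are illustrative, not taken from the paper):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Per-feature batch normalization over axis 0 (the batch axis)."""
    mu = x.mean(axis=0)          # per-feature batch mean
    var = x.var(axis=0)          # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero-mean, unit-variance features
    return gamma * x_hat + beta  # learned affine scale and shift

# Toy batch of 3 examples with 2 features each
x = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
y = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
```

With identity scale and zero shift, each output column has approximately zero mean and unit variance over the batch, which is the stabilizing effect batch norm is used for; the paper's claim is that this same normalization also amplifies sensitivity to small input perturbations.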


Related research

06/19/2020
Towards an Adversarially Robust Normalization Approach
Batch Normalization (BatchNorm) is effective for improving the performan...

02/05/2018
Adversarial Vulnerability of Neural Networks Increases With Input Dimension
Over the past four years, neural networks have proven vulnerable to adve...

10/10/2020
Double Forward Propagation for Memorized Batch Normalization
Batch Normalization (BN) has been a standard component in designing deep...

10/07/2020
Batch Normalization Increases Adversarial Vulnerability: Disentangling Usefulness and Robustness of Model Features
Batch normalization (BN) has been widely used in modern deep neural netw...

10/21/2020
Is Batch Norm unique? An empirical investigation and prescription to emulate the best properties of common normalizers without batch dependence
We perform an extensive empirical study of the statistical properties of...

03/05/2018
Norm matters: efficient and accurate normalization schemes in deep networks
Over the past few years batch-normalization has been commonly used in de...

06/29/2021
On the Periodic Behavior of Neural Network Training with Batch Normalization and Weight Decay
Despite the conventional wisdom that using batch normalization with weig...
