Training BatchNorm and Only BatchNorm: On the Expressive Power of Random Features in CNNs

02/29/2020
by Jonathan Frankle, et al.

Batch normalization (BatchNorm) has become an indispensable tool for training deep neural networks, yet it remains poorly understood. Although previous work has typically focused on its normalization component, BatchNorm also adds two trainable parameters per feature: a coefficient and a bias. However, the role and expressive power of these parameters remain unclear. To study this question, we investigate the performance achieved when training only these parameters and freezing all others at their random initializations. We find that doing so leads to surprisingly high performance. For example, a sufficiently deep ResNet reaches 83% accuracy in this configuration. Interestingly, BatchNorm achieves this performance in part by naturally learning to disable around a third of the random features without any changes to the training objective. Not only do these results highlight the under-appreciated role of the affine parameters in BatchNorm, but, in a broader sense, they also characterize the expressive power of neural networks constructed simply by shifting and rescaling random features.
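The training regime described above is straightforward to reproduce. BatchNorm normalizes each feature and then applies a learned per-feature affine transform y = γ·x̂ + β; the experiment trains only these γ and β while every other weight stays frozen at its random initialization. The sketch below (PyTorch-style, not the authors' code; the ResNet depth, learning rate, and near-zero threshold are illustrative assumptions) shows one way to set this up.

```python
# Minimal sketch: train only BatchNorm's affine parameters,
# freezing all other weights at their random initializations.
import torch
import torch.nn as nn
from torchvision.models import resnet18

model = resnet18(num_classes=10)  # depth/dataset choice is illustrative

# Freeze every parameter, then re-enable only BatchNorm's gamma and beta.
for param in model.parameters():
    param.requires_grad = False
for module in model.modules():
    if isinstance(module, nn.BatchNorm2d):
        module.weight.requires_grad = True  # gamma: per-feature coefficient
        module.bias.requires_grad = True    # beta:  per-feature bias

trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=0.1, momentum=0.9)
# ... run a standard training loop with the usual cross-entropy objective ...

# The paper reports that roughly a third of the learned gammas end up
# near zero, effectively disabling those random features. One way to
# measure this (threshold is an illustrative assumption):
with torch.no_grad():
    gammas = torch.cat([m.weight.flatten()
                        for m in model.modules()
                        if isinstance(m, nn.BatchNorm2d)])
    frac_disabled = (gammas.abs() < 1e-2).float().mean()
```

Only the BatchNorm coefficients and biases receive gradients here, a small fraction of the network's parameters, which is what makes the reported accuracy surprising.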

Related research

- The Expressive Power of Tuning Only the Norm Layers (02/15/2023)
- An Empirical Analysis of the Shift and Scale Parameters in BatchNorm (03/22/2023)
- On Fragile Features and Batch Normalization in Adversarial Training (04/26/2022)
- Time for a Background Check! Uncovering the Impact of Background Features on Deep Neural Networks (06/24/2020)
- GradNets: Dynamic Interpolation Between Neural Architectures (11/21/2015)
- Towards Understanding Normalization in Neural ODEs (04/20/2020)
- Expressibility-Enhancing Strategies for Quantum Neural Networks (11/23/2022)
