Sandwich Batch Normalization

02/22/2021
by Xinyu Gong, et al.

We present Sandwich Batch Normalization (SaBN), an embarrassingly easy improvement of Batch Normalization (BN) requiring only a few lines of code changes. SaBN is motivated by addressing the inherent feature distribution heterogeneity that can be identified in many tasks, which may arise from data heterogeneity (multiple input domains) or model heterogeneity (dynamic architectures, model conditioning, etc.). Our SaBN factorizes the BN affine layer into one shared sandwich affine layer, cascaded by several parallel independent affine layers. Concrete analysis reveals that, during optimization, SaBN promotes balanced gradient norms while still preserving diverse gradient directions: a property that many application tasks seem to favor. We demonstrate the prevailing effectiveness of SaBN as a drop-in replacement in four tasks: conditional image generation, neural architecture search (NAS), adversarial training, and arbitrary style transfer. Leveraging SaBN immediately achieves better Inception Score and FID on CIFAR-10 and ImageNet conditional image generation with three state-of-the-art GANs; significantly boosts the performance of a state-of-the-art weight-sharing NAS algorithm on NAS-Bench-201; substantially improves the robust and standard accuracies for adversarial defense; and produces superior arbitrary stylized results. We also provide visualizations and analysis to help understand why SaBN works. Code is available at https://github.com/VITA-Group/Sandwich-Batch-Normalization.
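The factorization described above (normalization, then one shared affine, then parallel independent affines selected per input) is simple enough to sketch directly. Below is a minimal PyTorch-style illustration of that idea; this is not the authors' implementation, and the class name SandwichBatchNorm2d and the arguments num_branches and branch_idx are hypothetical names introduced here for illustration.

import torch
import torch.nn as nn

# Minimal sketch of Sandwich Batch Normalization (SaBN): the BN affine
# transform is factorized into one shared ("sandwich") affine layer,
# cascaded with several parallel, independently parameterized affine
# layers. Names are hypothetical, for illustration only.
class SandwichBatchNorm2d(nn.Module):
    def __init__(self, num_features, num_branches):
        super().__init__()
        # Normalization itself carries no affine parameters.
        self.bn = nn.BatchNorm2d(num_features, affine=False)
        # Shared sandwich affine layer.
        self.shared_gamma = nn.Parameter(torch.ones(1, num_features, 1, 1))
        self.shared_beta = nn.Parameter(torch.zeros(1, num_features, 1, 1))
        # Parallel independent affine layers, one per branch
        # (e.g., per class, per domain, or per candidate architecture).
        self.gammas = nn.Embedding(num_branches, num_features)
        self.betas = nn.Embedding(num_branches, num_features)
        nn.init.ones_(self.gammas.weight)
        nn.init.zeros_(self.betas.weight)

    def forward(self, x, branch_idx):
        # x: (N, C, H, W); branch_idx: LongTensor of shape (N,)
        # selecting one independent affine per sample.
        h = self.shared_gamma * self.bn(x) + self.shared_beta
        gamma = self.gammas(branch_idx).unsqueeze(-1).unsqueeze(-1)
        beta = self.betas(branch_idx).unsqueeze(-1).unsqueeze(-1)
        return gamma * h + beta

In conditional image generation, branch_idx would be the class label; in weight-sharing NAS, it could index the sampled candidate, matching the data-heterogeneity and model-heterogeneity settings named in the abstract.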


Related Research

08/11/2019
AutoGAN: Neural Architecture Search for Generative Adversarial Networks
Neural architecture search (NAS) has witnessed prevailing success in ima...

07/31/2019
An Empirical Study of Batch Normalization and Group Normalization in Conditional Computation
Batch normalization has been widely used to improve optimization in deep...

11/30/2021
EAGAN: Efficient Two-stage Evolutionary Architecture Search for GANs
Generative Adversarial Networks (GANs) have been proven hugely successfu...

04/16/2021
"BNN - BN = ?": Training Binary Neural Networks without Batch Normalization
Batch normalization (BN) is a key facilitator and considered essential f...

06/01/2018
Whitening and Coloring transform for GANs
Batch Normalization (BN) is a common technique used both in discriminati...

12/04/2020
Batch Group Normalization
Deep Convolutional Neural Networks (DCNNs) are hard and time-consuming t...

01/29/2022
Task-Balanced Batch Normalization for Exemplar-based Class-Incremental Learning
Batch Normalization (BN) is an essential layer for training neural netwo...

Code Repositories

Sandwich-Batch-Normalization

[preprint] "Sandwich Batch Normalization" by Xinyu Gong, Wuyang Chen, Tianlong Chen and Zhangyang Wang

