Sandwich Batch Normalization

02/22/2021
by Xinyu Gong et al.

We present Sandwich Batch Normalization (SaBN), an embarrassingly easy improvement of Batch Normalization (BN) requiring only a few lines of code changes. SaBN is motivated by addressing the inherent feature distribution heterogeneity that can be identified in many tasks, which may arise from data heterogeneity (multiple input domains) or model heterogeneity (dynamic architectures, model conditioning, etc.). SaBN factorizes the BN affine layer into one shared sandwich affine layer, cascaded by several parallel independent affine layers. Concrete analysis reveals that, during optimization, SaBN promotes balanced gradient norms while still preserving diverse gradient directions: a property that many application tasks seem to favor. We demonstrate the prevailing effectiveness of SaBN as a drop-in replacement in four tasks: conditional image generation, neural architecture search (NAS), adversarial training, and arbitrary style transfer. Leveraging SaBN immediately achieves better Inception Score and FID on CIFAR-10 and ImageNet conditional image generation with three state-of-the-art GANs; boosts the performance of a state-of-the-art weight-sharing NAS algorithm significantly on NAS-Bench-201; substantially improves the robust and standard accuracies for adversarial defense; and produces superior arbitrary stylized results. We also provide visualizations and analysis to help understand why SaBN works. Code is available at https://github.com/VITA-Group/Sandwich-Batch-Normalization.
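The factorization described above can be sketched as follows: standard BN normalization, then a single shared ("sandwich") affine transform applied to every sample, then one of several independent affine transforms selected per sample by its condition (e.g., a class label). This is a minimal NumPy sketch, not the authors' implementation; the function and parameter names (`sandwich_bn`, `gamma_sh`, `gammas`, etc.) are illustrative, and the real code operates on 4-D feature maps inside a PyTorch module.

```python
import numpy as np

def sandwich_bn(x, cond_idx, gamma_sh, beta_sh, gammas, betas, eps=1e-5):
    """Illustrative SaBN forward pass on a 2-D batch (batch, features).

    x        : (N, C) input features
    cond_idx : (N,) integer condition (e.g., class) index per sample
    gamma_sh, beta_sh : shared sandwich affine parameters, shape (C,) or scalar
    gammas, betas     : (K, C) independent affine parameters, one row per condition
    """
    # Standard BN normalization over the batch dimension, per feature.
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    # One shared sandwich affine layer applied to all samples.
    y = gamma_sh * x_hat + beta_sh
    # Parallel independent affine layers: each sample uses the one
    # indexed by its condition.
    return gammas[cond_idx] * y + betas[cond_idx]
```

With the shared affine set to identity (`gamma_sh=1, beta_sh=0`) and all per-condition affines set to identity as well, this reduces exactly to plain batch normalization, which is what makes SaBN a drop-in replacement.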
