Sandwich Batch Normalization

by Xinyu Gong, et al.

We present Sandwich Batch Normalization (SaBN), an embarrassingly easy improvement of Batch Normalization (BN) requiring only a few lines of code changes. SaBN is motivated by addressing the inherent feature distribution heterogeneity that can be identified in many tasks, which can arise from data heterogeneity (multiple input domains) or model heterogeneity (dynamic architectures, model conditioning, etc.). Our SaBN factorizes the BN affine layer into one shared sandwich affine layer, cascaded with several parallel independent affine layers. Concrete analysis reveals that, during optimization, SaBN promotes balanced gradient norms while still preserving diverse gradient directions: a property that many application tasks seem to favor. We demonstrate the prevailing effectiveness of SaBN as a drop-in replacement in four tasks: conditional image generation, neural architecture search (NAS), adversarial training, and arbitrary style transfer. Leveraging SaBN immediately achieves better Inception Score and FID on CIFAR-10 and ImageNet conditional image generation with three state-of-the-art GANs; boosts the performance of a state-of-the-art weight-sharing NAS algorithm significantly on NAS-Bench-201; substantially improves the robust and standard accuracies for adversarial defense; and produces superior arbitrary stylized results. We also provide visualizations and analysis to help understand why SaBN works. Codes are available at
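The factorization described above can be sketched as a small PyTorch module: a BN layer with its own affine disabled, followed by one shared ("sandwich") affine transform and a bank of parallel branch-specific affine transforms selected at the forward pass (e.g., by class label in a conditional GAN or by sub-architecture in a weight-sharing supernet). This is a minimal illustrative sketch, not the authors' implementation; the class and parameter names are assumptions.

```python
import torch
import torch.nn as nn


class SandwichBatchNorm2d(nn.Module):
    """Illustrative sketch of Sandwich Batch Normalization (SaBN).

    Normalization is followed by one shared affine layer, cascaded with
    several parallel independent affine layers chosen per branch.
    Names (e.g. `num_branches`, `branch_idx`) are hypothetical.
    """

    def __init__(self, num_features: int, num_branches: int):
        super().__init__()
        # BN without its own affine; SaBN supplies the affine layers instead.
        self.norm = nn.BatchNorm2d(num_features, affine=False)
        # Shared "sandwich" affine layer, applied to every input.
        self.shared_gamma = nn.Parameter(torch.ones(1, num_features, 1, 1))
        self.shared_beta = nn.Parameter(torch.zeros(1, num_features, 1, 1))
        # Parallel independent affine layers, one per branch
        # (e.g. per class, per domain, or per candidate architecture).
        self.gammas = nn.Parameter(torch.ones(num_branches, num_features))
        self.betas = nn.Parameter(torch.zeros(num_branches, num_features))

    def forward(self, x: torch.Tensor, branch_idx: int) -> torch.Tensor:
        h = self.norm(x)
        # Shared affine first...
        h = h * self.shared_gamma + self.shared_beta
        # ...then the branch-specific affine selected for this input.
        g = self.gammas[branch_idx].view(1, -1, 1, 1)
        b = self.betas[branch_idx].view(1, -1, 1, 1)
        return h * g + b


# Usage sketch: 8-channel features, 3 branches (e.g. 3 conditioning classes).
layer = SandwichBatchNorm2d(num_features=8, num_branches=3)
out = layer(torch.randn(2, 8, 4, 4), branch_idx=1)
```

The shared layer is what couples the branches during optimization; the independent layers preserve per-branch flexibility, which matches the paper's account of balanced gradient norms with diverse gradient directions.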


