Revisiting Batch Normalization for Improving Corruption Robustness

10/07/2020
by Philipp Benz, et al.

Modern deep neural networks (DNNs) have demonstrated remarkable success in image recognition tasks when the test dataset and training dataset come from the same distribution. In practical applications, however, this assumption often does not hold, and a domain shift results in a performance drop. For example, the performance of DNNs trained on clean images has been shown to decrease when the test images contain common corruptions, limiting their use in performance-sensitive applications. In this work, we interpret corruption robustness as a domain shift problem and propose to rectify batch normalization (BN) statistics to improve model robustness. The shift from the clean domain to the corruption domain can be interpreted as a style shift that is captured by the BN statistics, so adapting those statistics is a natural way to rectify it. Specifically, we find that simply estimating and adapting the BN statistics on a few (32, for instance) representative samples, without retraining the model, improves corruption robustness by a large margin on several benchmark datasets and across a wide range of model architectures. For example, on ImageNet-C, statistics adaptation improves the top-1 accuracy from its 40.2% baseline and further improves state-of-the-art robust models beyond their 59.0% baseline.
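
To make the adaptation step concrete, below is a minimal sketch of re-estimating BN statistics on a small batch of corrupted samples without touching the model weights. It assumes PyTorch and torchvision with a pretrained ResNet-50; the function name adapt_bn_statistics and the random stand-in batch are illustrative, not the authors' released code, and the exact adaptation scheme in the paper may differ.

import torch
import torch.nn as nn
from torchvision import models

def adapt_bn_statistics(model: nn.Module, samples: torch.Tensor) -> nn.Module:
    """Re-estimate BatchNorm running statistics on a few target-domain samples."""
    for module in model.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            module.reset_running_stats()  # forget the clean-domain mean/variance
            module.momentum = None        # None => cumulative average over forward passes
    model.train()                         # train mode so BN layers update running stats
    with torch.no_grad():                 # no gradients, so weights are never changed
        model(samples)
    model.eval()                          # inference now uses the adapted statistics
    return model

# Hypothetical usage: 32 corrupted samples, here a random stand-in batch
# (real ImageNet-C images would be used in practice).
model = models.resnet50(weights="IMAGENET1K_V1")
corrupted_batch = torch.randn(32, 3, 224, 224)
model = adapt_bn_statistics(model, corrupted_batch)

Since only the BN running statistics change, this procedure is cheap enough to run once per corruption type before evaluation.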

Related research

- TTN: A Domain-Shift Aware Batch Normalization in Test-Time Adaptation (02/10/2023). This paper proposes a novel batch normalization strategy for test-time a...
- Improving robustness against common corruptions by covariate shift adaptation (06/30/2020). Today's state-of-the-art machine vision models are vulnerable to image c...
- Delving into the Estimation Shift of Batch Normalization in a Network (03/21/2022). Batch normalization (BN) is a milestone technique in deep learning. It n...
- Test-time Batch Statistics Calibration for Covariate Shift (10/06/2021). Deep neural networks have a clear degradation when applying to the unsee...
- Revisiting Batch Normalization For Practical Domain Adaptation (03/15/2016). Deep neural networks (DNN) have shown unprecedented success in various c...
- Towards an Adversarially Robust Normalization Approach (06/19/2020). Batch Normalization (BatchNorm) is effective for improving the performan...
- Limitations of Post-Hoc Feature Alignment for Robustness (03/10/2021). Feature alignment is an approach to improving robustness to distribution...
