TTN: A Domain-Shift Aware Batch Normalization in Test-Time Adaptation

02/10/2023
by Hyesu Lim, et al.

This paper proposes a novel batch normalization strategy for test-time adaptation. Recent test-time adaptation methods rely heavily on a modified batch normalization, i.e., transductive batch normalization (TBN), which computes the mean and variance from the current test batch rather than using the running mean and variance estimated from the source data, i.e., conventional batch normalization (CBN). Adopting TBN, which uses test-batch statistics, mitigates the performance degradation caused by domain shift. However, re-estimating normalization statistics from test data rests on the impractical assumptions that the test batch is sufficiently large and drawn from an i.i.d. stream, and we observe that previous methods using TBN suffer a critical performance drop when these assumptions do not hold. In this paper, we identify that CBN and TBN are in a trade-off relationship and present a new test-time normalization (TTN) method that interpolates the statistics, adjusting the importance between CBN and TBN according to the domain-shift sensitivity of each BN layer. Our proposed TTN improves model robustness to shifted domains across a wide range of batch sizes and in various realistic evaluation scenarios. TTN applies broadly to other test-time adaptation methods that update model parameters via backpropagation. We demonstrate that adopting TTN further improves their performance and achieves state-of-the-art results on various standard benchmarks.
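The interpolation described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: `alpha` stands for a per-layer interpolation weight between TBN (batch statistics) and CBN (running statistics), the function name is hypothetical, the paper's exact variance-mixing formula may differ, and BN's learnable affine scale and shift are omitted.

```python
import numpy as np

def ttn_normalize(x, running_mean, running_var, alpha, eps=1e-5):
    """Normalize x (batch, features) with statistics interpolated between
    the current test batch (TBN) and the source running statistics (CBN).

    alpha in [0, 1] is a hypothetical per-layer weight: alpha=1 recovers
    pure TBN, alpha=0 recovers pure CBN.
    """
    # Test-batch statistics (TBN side)
    batch_mean = x.mean(axis=0)
    batch_var = x.var(axis=0)
    # Convex combination of test-batch and source statistics
    mean = alpha * batch_mean + (1.0 - alpha) * running_mean
    var = alpha * batch_var + (1.0 - alpha) * running_var
    return (x - mean) / np.sqrt(var + eps)
```

With `alpha = 1.0` the output is standardized against the test batch itself; with `alpha = 0.0` it is standardized against the source running statistics, matching the two extremes the abstract contrasts.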

Related research

- 05/20/2022, Test-time Batch Normalization
  Deep neural networks often suffer the data distribution shift between tr...
- 10/07/2020, Revisiting Batch Normalization for Improving Corruption Robustness
  Modern deep neural networks (DNN) have demonstrated remarkable success i...
- 10/17/2022, Learning Less Generalizable Patterns with an Asymmetrically Trained Double Classifier for Better Test-Time Adaptation
  Deep neural networks often fail to generalize outside of their training ...
- 10/05/2021, Distribution Mismatch Correction for Improved Robustness in Deep Neural Networks
  Deep neural networks rely heavily on normalization methods to improve th...
- 08/30/2022, FUSION: Fully Unsupervised Test-Time Stain Adaptation via Fused Normalization Statistics
  Staining reveals the micro structure of the aspirate while creating hist...
- 03/17/2020, Rethinking Batch Normalization in Transformers
  The standard normalization method for neural network (NN) models used in...
- 07/27/2023, Test Time Adaptation for Blind Image Quality Assessment
  While the design of blind image quality assessment (IQA) algorithms has ...
