
A simple normalization technique using window statistics to improve the out-of-distribution generalization in medical images

by Chengfeng Zhou et al.
Zhejiang University
Shanghai Jiao Tong University
Tencent QQ
NetEase, Inc
Fudan University

Because data scarcity and data heterogeneity are prevalent in medical imaging, Convolutional Neural Networks (CNNs) trained with existing normalization methods may perform poorly when deployed to a new site. A reliable model for real-world applications, however, should generalize well on both in-distribution (IND) and out-of-distribution (OOD) data (e.g., data from a new site). In this study, we present a novel normalization technique called window normalization (WIN), a simple yet effective alternative to existing normalization methods. Specifically, WIN perturbs the normalizing statistics with local statistics computed on a window of the features. This feature-level augmentation regularizes models well and significantly improves their OOD generalization. Building on WIN, we propose a novel self-distillation method called WIN-WIN, which further improves OOD generalization in classification. WIN-WIN is easily implemented with two forward passes and a consistency constraint, and can serve as a simple extension to existing methods. Extensive experimental results on various tasks (such as glaucoma detection, breast cancer detection, chromosome classification, optic disc and cup segmentation, etc.) across 26 datasets demonstrate the generality and effectiveness of our methods. The code is available at
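To make the abstract's idea concrete, here is a minimal NumPy sketch of what "perturbing the normalizing statistics with local window statistics" could look like, and how two stochastic forward passes yield a consistency term in the spirit of WIN-WIN. This is an illustration inferred from the abstract only, not the authors' implementation; the function name, the `window` and `mix` parameters, and the MSE consistency term are all assumptions.

```python
import numpy as np

def window_normalization(x, window=0.5, mix=0.7, eps=1e-5, rng=None):
    """Illustrative sketch of window normalization (WIN).

    Normalizes a feature map of shape (N, C, H, W) with statistics that
    blend full instance-wise statistics and the statistics of a random
    spatial window. `window` (window size as a fraction of the spatial
    extent) and `mix` (weight on the full statistics) are hypothetical
    knobs for this sketch; mix=1.0 recovers plain instance normalization.
    """
    if rng is None:
        rng = np.random.default_rng()
    n, c, h, w = x.shape
    # Full (instance-wise) statistics over the spatial dimensions.
    mu = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    # Statistics of a random spatial window of the features.
    wh, ww = max(1, int(h * window)), max(1, int(w * window))
    top = rng.integers(0, h - wh + 1)
    left = rng.integers(0, w - ww + 1)
    win = x[:, :, top:top + wh, left:left + ww]
    mu_w = win.mean(axis=(2, 3), keepdims=True)
    var_w = win.var(axis=(2, 3), keepdims=True)
    # Perturb the normalizing statistics with the window statistics.
    mu_mix = mix * mu + (1 - mix) * mu_w
    var_mix = mix * var + (1 - mix) * var_w
    return (x - mu_mix) / np.sqrt(var_mix + eps)

feats = np.random.default_rng(0).standard_normal((2, 3, 16, 16))
out = window_normalization(feats)

# WIN-WIN flavor: two stochastic forward passes (different random
# windows) plus a consistency penalty between the two outputs. The
# abstract constrains predictions; MSE here is a stand-in.
z1 = window_normalization(feats, rng=np.random.default_rng(1))
z2 = window_normalization(feats, rng=np.random.default_rng(2))
consistency = np.mean((z1 - z2) ** 2)
```

Because the window is sampled anew on every call, the same input yields slightly different normalized features each pass, which is what makes the augmentation and the twice-forward consistency constraint possible.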



