Few-Shot Adaptation of Pre-Trained Networks for Domain Shift

05/30/2022
by Wenyu Zhang, et al.

Deep networks are prone to performance degradation when there is a domain shift between the source (training) data and target (test) data. Recent test-time adaptation methods update the batch normalization layers of pre-trained source models deployed in new target environments with streaming data to mitigate such degradation. Although these methods can adapt on the fly without first collecting a large target-domain dataset, their performance depends on streaming conditions such as mini-batch size and class distribution, which can be unpredictable in practice. In this work, we propose a framework for few-shot domain adaptation to address the practical challenges of data-efficient adaptation. Specifically, we propose a constrained optimization of feature normalization statistics in pre-trained source models, supervised by a small support set from the target domain. Our method is easy to implement and improves source model performance with as few as one sample per class for classification tasks. Extensive experiments on 5 cross-domain classification and 4 semantic segmentation datasets show that our method achieves more accurate and reliable performance than test-time adaptation, while not being constrained by streaming conditions.
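The core idea of adapting feature normalization statistics with a small target support set can be illustrated with a minimal NumPy sketch. This is not the paper's exact constrained optimization; it simply shows the general mechanism, with a hypothetical mixing coefficient `alpha` standing in for the constraint that keeps the adapted statistics close to the source ones:

```python
import numpy as np

def adapt_bn_stats(src_mean, src_var, support_feats, alpha=0.5):
    """Blend source normalization statistics with statistics estimated
    from a few labeled target support samples.

    alpha constrains how far the adapted statistics may move from the
    source ones: alpha=0 keeps the source statistics, alpha=1 uses
    target-only estimates. (Illustrative sketch, not the paper's method.)
    """
    tgt_mean = support_feats.mean(axis=0)
    tgt_var = support_feats.var(axis=0)
    new_mean = (1.0 - alpha) * src_mean + alpha * tgt_mean
    new_var = (1.0 - alpha) * src_var + alpha * tgt_var
    return new_mean, new_var

def normalize(x, mean, var, eps=1e-5):
    """Standard feature normalization with the (adapted) statistics."""
    return (x - mean) / np.sqrt(var + eps)

# Usage: a 5-shot support set of 3-dimensional features from the target domain.
rng = np.random.default_rng(0)
support = rng.normal(loc=2.0, scale=1.5, size=(5, 3))
mean, var = adapt_bn_stats(np.zeros(3), np.ones(3), support, alpha=0.5)
z = normalize(support, mean, var)
```

With as few as one support sample per class, even such a simple statistics update can shift the normalization toward the target distribution; the paper's contribution is in doing this under an explicit constraint rather than a fixed interpolation weight.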

