
Understanding Cross-Domain Few-Shot Learning: An Experimental Study

by Jaehoon Oh, et al.

Cross-domain few-shot learning (CD-FSL) has drawn increasing attention for handling large differences between the source and target domains, an important concern in real-world scenarios. To overcome these differences, recent works have exploited small-scale unlabeled data from the target domain during the pre-training stage. This data enables self-supervised pre-training on the target domain, in addition to supervised pre-training on the source domain. In this paper, we empirically investigate the scenarios under which each pre-training scheme is advantageous, characterized by domain similarity and few-shot difficulty: the performance gain of self-supervised pre-training over supervised pre-training increases as domain similarity or few-shot difficulty decreases. We further design two pre-training schemes, mixed-supervised and two-stage learning, that improve performance. Altogether, we present seven findings for CD-FSL, supported by extensive experiments and analyses on three source and eight target benchmark datasets with varying levels of domain similarity and few-shot difficulty. Our code is available at
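The abstract contrasts supervised pre-training on the labeled source domain with self-supervised pre-training on unlabeled target data, and mentions a mixed-supervised scheme combining the two. A minimal sketch of how such a combined objective might look is below: a supervised cross-entropy term on a source example plus an InfoNCE-style contrastive term on an unlabeled target example, weighted by a mixing coefficient. The function names, the specific contrastive form, and the weight `gamma` are illustrative assumptions, not the authors' implementation.

```python
import math

def cross_entropy(probs, label):
    # Supervised loss on a labeled source example:
    # negative log-probability assigned to the true class.
    return -math.log(probs[label])

def info_nce_loss(sim_pos, sims_all, temperature=0.5):
    # A minimal InfoNCE-style self-supervised loss on an unlabeled
    # target example: pull the positive pair together relative to
    # all candidate pairs (sims_all includes the positive similarity).
    denom = sum(math.exp(s / temperature) for s in sims_all)
    return -math.log(math.exp(sim_pos / temperature) / denom)

def mixed_supervised_loss(source_probs, source_label,
                          sim_pos, sims_all, gamma=0.5):
    # Convex combination of the two objectives; gamma is a
    # hypothetical mixing weight (gamma=0 recovers purely
    # supervised pre-training, gamma=1 purely self-supervised).
    l_sup = cross_entropy(source_probs, source_label)
    l_ssl = info_nce_loss(sim_pos, sims_all)
    return (1 - gamma) * l_sup + gamma * l_ssl
```

In a real pipeline these terms would be computed over mini-batches of embeddings from a shared encoder; the toy scalar version above only illustrates how the two supervision signals can be blended into a single training objective.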


Related research:

- Self-training for Few-shot Transfer Across Extreme Task Differences
- Cross-domain few-shot learning with unlabelled data
- Enabling the Network to Surf the Internet
- Rethinking Few-Shot Object Detection on a Multi-Domain Benchmark
- CD-FSOD: A Benchmark for Cross-domain Few-shot Object Detection
- Self-Supervised Contrastive Pre-Training For Time Series via Time-Frequency Consistency
- Effect of large-scale pre-training on full and few-shot transfer learning for natural and medical images