
Interventional Few-Shot Learning

by Zhongqi Yue, et al.

We uncover a long-overlooked deficiency in prevailing Few-Shot Learning (FSL) methods: the pre-trained knowledge is in fact a confounder that limits performance. This finding is rooted in our causal assumption: a Structural Causal Model (SCM) of the causalities among the pre-trained knowledge, sample features, and labels. Based on this SCM, we propose a novel FSL paradigm: Interventional Few-Shot Learning (IFSL). Specifically, we develop three effective IFSL algorithmic implementations based on the backdoor adjustment, which is essentially a causal intervention towards the SCM of many-shot learning: the upper bound of FSL in a causal view. Notably, the contribution of IFSL is orthogonal to existing fine-tuning- and meta-learning-based FSL methods, so IFSL can improve all of them, achieving a new 1-/5-shot state of the art on miniImageNet, tieredImageNet, and cross-domain CUB. Code is released at
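As a hedged sketch of the key idea (the symbols here are illustrative, not the paper's exact notation): if the pre-trained knowledge D confounds the sample features X and labels Y, the backdoor adjustment estimates the interventional distribution by stratifying over D, rather than using the confounded conditional that standard FSL implicitly relies on:

```latex
% Backdoor adjustment over the confounder D (pre-trained knowledge),
% assuming D can be stratified into values {d}:
P(Y \mid \mathrm{do}(X)) = \sum_{d} P(Y \mid X, D = d)\, P(D = d)

% Contrast with the confounded conditional used by conventional FSL,
% where D's distribution depends on the observed X:
P(Y \mid X) = \sum_{d} P(Y \mid X, D = d)\, P(D = d \mid X)
```

The difference between the two expressions is the weighting term: $P(D = d)$ severs the spurious dependence of the pre-trained knowledge on the observed sample, which is what "intervention" means here.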

