Revisiting Mid-Level Patterns for Distant-Domain Few-Shot Recognition

08/07/2020
by   Yixiong Zou, et al.

Cross-domain few-shot learning (FSL) was recently proposed to transfer knowledge from general-domain known classes (e.g., ImageNet) to novel classes in other domains, and to recognize novel classes with only a few training samples. In this paper, we go further and define a more challenging scenario that transfers knowledge from general-domain known classes to novel classes in distant domains that differ significantly from the general domain, e.g., medical data. To solve this challenging problem, we propose to exploit mid-level features, which are more transferable yet under-explored in recent mainstream FSL works. To boost the discriminability of mid-level features, we propose a residual-prediction task for training on known classes. In this task, we view the current training sample as a sample from a pseudo-novel class, so as to provide simulated novel-class data. However, this simulated data comes from the same domain as the known classes and shares high-level patterns with the other known classes. Therefore, we represent it with high-level patterns from the other known classes and remove this high-level representation from the simulated data, yielding a residual term that contains the discriminative information which cannot be represented by high-level patterns from the other known classes. Mid-level features from multiple mid-layers are then dynamically weighted to predict this residual term, which encourages the mid-level features to be discriminative. Notably, our method can be applied to both the regular in-domain FSL setting, by emphasizing high-level transformed mid-level features, and the distant-domain FSL setting, by emphasizing mid-level features. Experiments under both settings on six public datasets (including two challenging medical datasets) validate the rationale of the proposed method and demonstrate state-of-the-art performance in both settings.
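To make the residual-prediction task concrete, below is a minimal PyTorch sketch of the idea as described in the abstract: a pseudo-novel sample is reconstructed from high-level patterns of the other known classes, the unexplained residual is computed, and dynamically weighted mid-level features are trained to predict that residual. All module names, dimensions, the prototype-based class patterns, and the softmax-attention weighting are illustrative assumptions, not the authors' exact implementation.

```python
# Illustrative sketch of a residual-prediction head (assumed design, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualPredictionHead(nn.Module):
    def __init__(self, feat_dim=512, num_mid_layers=3, num_known_classes=64):
        super().__init__()
        # High-level "patterns" of the known classes (assumed to be learnable prototypes).
        self.class_patterns = nn.Parameter(torch.randn(num_known_classes, feat_dim))
        # One projection per mid-layer so all mid-level features share feat_dim.
        self.mid_proj = nn.ModuleList(
            [nn.Linear(feat_dim, feat_dim) for _ in range(num_mid_layers)]
        )
        # Produces per-sample dynamic weights over the mid-layers.
        self.weight_net = nn.Linear(feat_dim, num_mid_layers)

    def forward(self, high_feat, mid_feats, label):
        # high_feat: (B, D) high-level feature of the pseudo-novel sample
        # mid_feats: list of (B, D) pooled mid-level features from several mid-layers
        # label:     (B,) known-class index of the current sample

        # 1) Represent the sample with high-level patterns of the *other* known classes.
        sim = high_feat @ self.class_patterns.t()                    # (B, C)
        own_class = F.one_hot(label, self.class_patterns.size(0)).bool()
        sim = sim.masked_fill(own_class, float('-inf'))              # exclude its own class
        coef = sim.softmax(dim=-1)                                   # (B, C)
        reconstruction = coef @ self.class_patterns                  # (B, D)

        # 2) Residual: what the other classes' high-level patterns cannot explain.
        residual = high_feat - reconstruction                        # (B, D)

        # 3) Dynamically weight the mid-level features and predict the residual.
        projected = torch.stack(
            [proj(f) for proj, f in zip(self.mid_proj, mid_feats)], dim=1
        )                                                            # (B, L, D)
        weights = self.weight_net(high_feat).softmax(dim=-1)         # (B, L)
        prediction = (weights.unsqueeze(-1) * projected).sum(dim=1)  # (B, D)

        # Regression loss encouraging mid-level features to carry the
        # discriminative information left in the residual.
        return F.mse_loss(prediction, residual.detach())
```

In this sketch the residual is detached so the auxiliary loss shapes only the mid-level branch; whether the original method stops gradients there, and how it combines this loss with the standard classification loss, is not specified in the abstract.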
