Many-Class Few-Shot Learning on Multi-Granularity Class Hierarchy

06/28/2020
by Lu Liu, et al.

We study the many-class few-shot (MCFS) problem in both supervised learning and meta-learning settings. Compared to the well-studied many-class many-shot and few-class few-shot problems, the MCFS problem commonly occurs in practical applications but has rarely been studied in previous literature. It brings the new challenge of distinguishing between many classes given only a few training samples per class. In this paper, we leverage the class hierarchy as prior knowledge to train a coarse-to-fine classifier that produces accurate predictions for the MCFS problem in both settings. The proposed model, the "memory-augmented hierarchical-classification network (MahiNet)", performs coarse-to-fine classification where each coarse class covers multiple fine classes. Since it is challenging to directly distinguish a variety of fine classes given only few-shot data per class, MahiNet starts by learning a classifier over coarse classes, which have more training data and whose labels are much cheaper to obtain. The coarse classifier reduces the search range over the fine classes and thus alleviates the challenge of "many classes". Architecturally, MahiNet first deploys a convolutional neural network (CNN) to extract features. It then integrates a memory-augmented attention module and a multi-layer perceptron (MLP) to produce the probabilities over coarse and fine classes. While the MLP extends the linear classifier, the attention module extends the k-nearest-neighbor (KNN) classifier, together targeting the "few-shot" problem. We design several training strategies for MahiNet in both supervised learning and meta-learning. In addition, we propose two novel benchmark datasets, "mcfsImageNet" and "mcfsOmniglot", specifically designed for the MCFS problem. In experiments, we show that MahiNet outperforms several state-of-the-art models on the MCFS problem in both supervised learning and meta-learning.
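The architectural description above maps naturally onto a short sketch. The following is a minimal, hypothetical PyTorch rendering of a coarse-to-fine classifier in the spirit of MahiNet, not the authors' implementation: the class name, layer sizes, the cosine-similarity memory score, the temperature of 10.0, and the way coarse probabilities gate the fine-class probabilities are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CoarseToFineClassifier(nn.Module):
    """Hypothetical MahiNet-style coarse-to-fine classifier (illustrative sketch)."""

    def __init__(self, backbone, feat_dim, n_coarse, n_fine, coarse_to_fine):
        # coarse_to_fine: (n_coarse, n_fine) 0/1 tensor; entry (c, f) is 1
        # iff fine class f belongs to coarse class c.
        super().__init__()
        self.backbone = backbone                          # CNN feature extractor
        self.coarse_head = nn.Linear(feat_dim, n_coarse)  # MLP-style coarse classifier
        self.fine_head = nn.Linear(feat_dim, n_fine)      # MLP-style fine classifier
        # Learnable per-fine-class memory used as a KNN/attention-style classifier.
        self.memory = nn.Parameter(torch.randn(n_fine, feat_dim))
        self.register_buffer("hierarchy", coarse_to_fine.float())

    def forward(self, x):
        feat = self.backbone(x)                                   # (B, feat_dim)
        coarse_logits = self.coarse_head(feat)                    # (B, n_coarse)

        # Cosine similarity to the class memory acts like a soft nearest-neighbour score.
        knn_score = F.normalize(feat, dim=-1) @ F.normalize(self.memory, dim=-1).t()
        fine_logits = self.fine_head(feat) + 10.0 * knn_score     # combine MLP and KNN-style scores

        # Coarse predictions narrow the search over fine classes: each fine class
        # is weighted by the probability of its parent coarse class.
        coarse_prob = F.softmax(coarse_logits, dim=-1)            # (B, n_coarse)
        fine_prior = coarse_prob @ self.hierarchy                 # (B, n_fine)
        fine_prob = F.softmax(fine_logits, dim=-1) * fine_prior
        fine_prob = fine_prob / fine_prob.sum(dim=-1, keepdim=True).clamp_min(1e-12)
        return coarse_prob, fine_prob
```

Training such a sketch would combine cross-entropy losses on both the coarse and fine probabilities; the paper's actual attention module, memory update, and meta-learning episodes are more involved than shown here.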


Related research

05/10/2019 - Prototype Propagation Networks (PPN) for Weakly-supervised Few-shot Learning on Category Graph
A variety of machine learning applications expect to achieve rapid learn...

09/11/2019 - Learning to Propagate for Graph Meta-Learning
Meta-learning extracts the common knowledge acquired from learning diffe...

06/30/2021 - How to Train Your MAML to Excel in Few-Shot Classification
Model-agnostic meta-learning (MAML) is arguably the most popular meta-le...

04/09/2019 - Few-Shot Learning with Localization in Realistic Settings
Traditional recognition methods typically require large, artificially-ba...

07/11/2020 - Coarse-to-Fine Pseudo-Labeling Guided Meta-Learning for Few-Shot Classification
To endow neural networks with the potential to learn rapidly from a hand...

03/01/2022 - SMTNet: Hierarchical cavitation intensity recognition based on sub-main transfer network
With the rapid development of smart manufacturing, data-driven machinery...

12/07/2020 - Fine-grained Angular Contrastive Learning with Coarse Labels
Few-shot learning methods offer pre-training techniques optimized for ea...
