Coarse-to-Fine Pseudo-Labeling Guided Meta-Learning for Few-Shot Classification

07/11/2020
by   Jinhai Yang, et al.

To endow neural networks with the ability to learn rapidly from a handful of samples, meta-learning blazes a trail by acquiring cross-task knowledge from a variety of few-shot learning tasks. However, most existing meta-learning algorithms retain the requirement of fine-grained supervision, which is expensive in many applications. In this paper, we show that meta-learning models can extract transferable knowledge from coarse-grained supervision for few-shot classification. We propose a weakly-supervised framework, namely Coarse-to-fine Pseudo-labeling Guided Meta-Learning (CPGML), to alleviate the need for data annotation. In our framework, coarse categories are grouped into pseudo sub-categories to construct a task distribution for meta-training, based on the cosine distance between the corresponding embedding vectors of the images. To obtain better feature representations for this grouping, we develop a Dual-level Discriminative Embedding (DDE) that keeps the distances between learned embeddings consistent with both the visual similarity and the semantic relations of the input images. Moreover, motivated by the observation that training tasks are not equally informative, we propose a task-attention mechanism that down-weights training tasks with potentially higher label noise. Extensive experiments on two hierarchical meta-learning benchmarks demonstrate that, under the proposed framework, meta-learning models can effectively extract task-independent knowledge from the roughly generated tasks and generalize well to unseen tasks.
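The grouping step described above — splitting each coarse class into pseudo sub-categories by clustering image embeddings under cosine distance — can be sketched roughly as follows. The clustering routine (a spherical k-means on L2-normalized embeddings), the number of sub-categories `k`, and all parameter names are illustrative assumptions for the sketch, not the paper's exact procedure:

```python
import numpy as np

def pseudo_subcategories(embeddings, coarse_labels, k=2, iters=20, seed=0):
    """Split each coarse class into up to k pseudo sub-categories by
    clustering L2-normalized embeddings (cosine distance on unit vectors).
    Hypothetical sketch; not the authors' reference implementation."""
    rng = np.random.default_rng(seed)
    # Normalize so that dot product equals cosine similarity.
    emb = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    pseudo = np.zeros(len(emb), dtype=int)
    next_id = 0
    for c in np.unique(coarse_labels):
        idx = np.flatnonzero(coarse_labels == c)
        x = emb[idx]
        # Initialize centroids from random members of this coarse class.
        centers = x[rng.choice(len(x), size=min(k, len(x)), replace=False)].copy()
        for _ in range(iters):
            # Assign each sample to the most cosine-similar centroid.
            assign = np.argmax(x @ centers.T, axis=1)
            for j in range(len(centers)):
                members = x[assign == j]
                if len(members):
                    m = members.mean(axis=0)
                    centers[j] = m / np.linalg.norm(m)
        # Give sub-categories globally unique ids for task construction.
        pseudo[idx] = next_id + assign
        next_id += len(centers)
    return pseudo
```

The resulting pseudo sub-category labels can then be sampled as the "classes" of N-way K-shot episodes during meta-training, in place of fine-grained ground-truth labels.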
