Learning to Propagate for Graph Meta-Learning

09/11/2019
by Lu Liu, et al.

Meta-learning extracts the common knowledge acquired from learning different tasks and uses it for unseen tasks. It demonstrates a clear advantage on tasks that have insufficient training data, e.g., few-shot learning. In most meta-learning methods, tasks are implicitly related via a shared model or optimizer. In this paper, we show that a meta-learner that explicitly relates tasks on a graph describing the relations of their output dimensions (e.g., classes) can significantly improve few-shot learning performance. This type of graph is usually free or cheap to obtain but has rarely been explored in previous works. We study prototype-based few-shot classification, in which a prototype is generated for each class so that nearest-neighbor search among the prototypes produces an accurate classification. We introduce the "Gated Propagation Network (GPN)", which learns to propagate messages between the prototypes of different classes on the graph, so that learning the prototype of each class benefits from the data of other related classes. In GPN, an attention mechanism aggregates messages from neighboring classes, and a gate chooses between the aggregated message and the message from the class itself. GPN is trained on a sequence of tasks, from many-shot to few-shot, generated by subgraph sampling. During training, it reuses and updates previously computed prototypes stored in memory in a lifelong learning cycle. In experiments, we vary the training-test discrepancy and the test-task generation settings for a thorough evaluation. GPN outperforms recent meta-learning methods on two benchmark datasets in all studied cases.
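To make the propagation step concrete, the sketch below implements one round of attention-based aggregation with a per-class gate over prototypes on a class graph. It is a minimal sketch, assuming dot-product attention, a scalar sigmoid gate, and a dense adjacency matrix; the module name GatedPropagationStep and all architectural details are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of one gated propagation step over class prototypes.
# Assumptions: dot-product attention, a scalar gate, and a class graph
# in which every class has at least one neighbor (e.g., via self-loops).
import torch
import torch.nn as nn


class GatedPropagationStep(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.query = nn.Linear(dim, dim)   # transforms the receiving class's prototype
        self.key = nn.Linear(dim, dim)     # transforms neighboring prototypes
        self.gate = nn.Linear(2 * dim, 1)  # chooses self message vs. neighbor message

    def forward(self, prototypes, adjacency):
        """
        prototypes: (N, dim) tensor, one prototype per class
        adjacency:  (N, N) binary tensor, 1 where two classes are related on the graph
        """
        q = self.query(prototypes)                                # (N, dim)
        k = self.key(prototypes)                                  # (N, dim)
        scores = q @ k.t() / prototypes.size(1) ** 0.5            # scaled dot products
        # restrict attention to neighbors on the class graph
        scores = scores.masked_fill(adjacency == 0, float("-inf"))
        attn = torch.softmax(scores, dim=-1)                      # (N, N)
        neighbor_msg = attn @ prototypes                          # aggregated neighbor message
        # gate between the class's own prototype and the aggregated message
        g = torch.sigmoid(self.gate(torch.cat([prototypes, neighbor_msg], dim=-1)))
        return g * prototypes + (1 - g) * neighbor_msg            # propagated prototypes


# Usage example with illustrative sizes: 5 classes, 64-d prototypes
step = GatedPropagationStep(dim=64)
protos = torch.randn(5, 64)
adj = torch.eye(5) + torch.diag(torch.ones(4), 1) + torch.diag(torch.ones(4), -1)
updated = step(protos, adj)   # (5, 64)
```

In a prototype-based few-shot classifier, the output of such a step would replace the raw per-class prototypes, and a query example would then be assigned to the class whose propagated prototype is nearest to its embedding.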


Related research

Prototype Propagation Networks (PPN) for Weakly-supervised Few-shot Learning on Category Graph (05/10/2019)
A variety of machine learning applications expect to achieve rapid learn...

Attribute Propagation Network for Graph Zero-shot Learning (09/24/2020)
The goal of zero-shot learning (ZSL) is to train a model to classify sam...

Many-Class Few-Shot Learning on Multi-Granularity Class Hierarchy (06/28/2020)
We study many-class few-shot (MCFS) problem in both supervised learning ...

Learning to Support: Exploiting Structure Information in Support Sets for One-Shot Learning (08/22/2018)
Deep Learning shows very good performance when trained on large labeled ...

Edge-Labeling based Directed Gated Graph Network for Few-shot Learning (01/27/2021)
Existing graph-network-based few-shot learning methods obtain similarity...

ProtoDiff: Learning to Learn Prototypical Networks by Task-Guided Diffusion (06/26/2023)
Prototype-based meta-learning has emerged as a powerful technique for ad...

MetaNODE: Prototype Optimization as a Neural ODE for Few-Shot Learning (03/26/2021)
Few-Shot Learning (FSL) is a challenging task, i.e., how to recognize no...
