
Generating Pseudo-labels Adaptively for Few-shot Model-Agnostic Meta-Learning

07/09/2022 · by Guodong Liu, et al.

Model-Agnostic Meta-Learning (MAML) is a well-known few-shot learning method that has inspired many follow-up efforts, such as ANIL and BOIL. However, as an inductive method, MAML cannot fully exploit the information in the query set, which limits its potential for better generalization. To address this issue, we propose a simple yet effective method that generates pseudo-labels adaptively and can boost the performance of the MAML family. The proposed methods, dubbed Generative Pseudo-label based MAML (GP-MAML), GP-ANIL, and GP-BOIL, leverage statistics of the query set to improve performance on new tasks. Specifically, we adaptively add pseudo-labels to the query set, pick samples from it, and re-train the model using the picked query samples together with the support set. The GP series can also use information from the pseudo-labelled query set to re-train the network during meta-testing, a goal that some transductive methods, such as the Transductive Propagation Network (TPN), struggle to achieve.
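
The loop described above (adapt on the support set, pseudo-label the query set, pick samples, re-train) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the fixed confidence-threshold picking rule, the helper names inner_adapt and pseudo_label_meta_test, and all hyperparameters below are assumptions.

    import copy
    import torch
    import torch.nn.functional as F

    def inner_adapt(model, x, y, steps=5, lr=0.01):
        # Clone the model and take a few SGD steps on (x, y),
        # as in a plain MAML inner loop at meta-test time.
        adapted = copy.deepcopy(model)
        opt = torch.optim.SGD(adapted.parameters(), lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            F.cross_entropy(adapted(x), y).backward()
            opt.step()
        return adapted

    def pseudo_label_meta_test(model, x_support, y_support, x_query,
                               threshold=0.9, steps=5, lr=0.01):
        # 1) Adapt on the labelled support set.
        adapted = inner_adapt(model, x_support, y_support, steps, lr)

        # 2) Pseudo-label the unlabelled query set and keep only confident
        #    samples (a fixed threshold stands in for the adaptive picking
        #    rule; illustrative assumption).
        with torch.no_grad():
            probs = F.softmax(adapted(x_query), dim=1)
            conf, pseudo_y = probs.max(dim=1)
            keep = conf >= threshold

        # 3) Re-train on the support set plus the picked pseudo-labelled
        #    query samples.
        x_aug = torch.cat([x_support, x_query[keep]])
        y_aug = torch.cat([y_support, pseudo_y[keep]])
        readapted = inner_adapt(model, x_aug, y_aug, steps, lr)

        # 4) Predict on the full query set with the re-trained model.
        with torch.no_grad():
            return readapted(x_query).argmax(dim=1)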


Related research

04/21/2023 · Task-Adaptive Pseudo Labeling for Transductive Meta-Learning
Meta-learning performs adaptation through a limited amount of support se...

09/12/2020 · Few-shot Learning with LSSVM Base Learner and Transductive Modules
The performance of meta-learning approaches for few-shot learning genera...

12/27/2022 · Self Meta Pseudo Labels: Meta Pseudo Labels Without The Teacher
We present Self Meta Pseudo Labels, a novel semi-supervised learning met...

07/26/2020 · Don't Overlook the Support Set: Towards Improving Generalization in Meta-learning
Meta-learning has proven to be a powerful paradigm for transferring the ...

07/12/2021 · Few-shot Learning with Global Relatedness Decoupled-Distillation
Despite the success that metric learning based approaches have achieved ...

10/22/2022 · MetaASSIST: Robust Dialogue State Tracking with Meta Learning
Existing dialogue datasets contain lots of noise in their state annotati...