When does MAML Work the Best? An Empirical Study on Model-Agnostic Meta-Learning in NLP Applications

05/24/2020
by   Zequn Liu, et al.

Model-Agnostic Meta-Learning (MAML) has been successfully employed in NLP applications such as few-shot text classification and multi-domain low-resource language generation. Several factors can affect MAML's performance in NLP, including data quantity, similarity among tasks, and the balance between the general language model and task-specific adaptation, yet few works have studied them thoroughly. In this paper, we conduct an empirical study of these factors and, based on the experimental results, conclude when MAML works best.
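To make the method under study concrete: MAML learns meta-parameters such that a few gradient steps on a new task's data already give good performance. Below is a minimal first-order MAML (FOMAML) sketch on toy linear-regression tasks. This is an illustration of the general idea only, not the paper's experimental setup; the task distribution, model, and learning rates are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task():
    """A toy task: linear regression y = a * x with a task-specific slope a."""
    a = rng.uniform(0.5, 2.0)
    x = rng.uniform(-1.0, 1.0, size=(10, 1))
    return x, a * x

def loss_and_grad(w, x, y):
    """Mean squared error of the linear model x @ w, and its gradient in w."""
    err = x @ w - y
    return float(np.mean(err ** 2)), 2.0 * x.T @ err / len(x)

# First-order MAML: adapt to each task with one inner gradient step, then
# move the meta-parameters along the post-adaptation gradient (the first-order
# approximation drops the second-order term of full MAML).
w_meta = np.zeros((1, 1))
inner_lr, outer_lr, tasks_per_batch = 0.1, 0.05, 4

for step in range(500):
    meta_grad = np.zeros_like(w_meta)
    for _ in range(tasks_per_batch):
        x, y = make_task()
        _, g = loss_and_grad(w_meta, x, y)
        w_task = w_meta - inner_lr * g            # inner-loop adaptation
        _, g_post = loss_and_grad(w_task, x, y)   # gradient after adaptation
        meta_grad += g_post
    w_meta -= outer_lr * meta_grad / tasks_per_batch
```

After meta-training, `w_meta` sits where a single inner step adapts well on average across the task distribution, which is exactly the trade-off between a general model and task-specific adaptation that the paper's empirical study probes.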


Related research

06/03/2018 · On the Importance of Attention in Meta-Learning for Few-Shot Text Classification
Current deep learning based text classification methods are limited by t...

06/08/2022 · Sharp-MAML: Sharpness-Aware Model-Agnostic Meta Learning
Model-agnostic meta learning (MAML) is currently one of the dominating a...

09/22/2020 · An Empirical Study on Neural Keyphrase Generation
Recent years have seen a flourishing of neural keyphrase generation work...

03/22/2022 · Improving Meta-learning for Low-resource Text Classification and Generation via Memory Imitation
Building models of natural language processing (NLP) is challenging in l...

06/12/2018 · Learning to Automatically Generate Fill-In-The-Blank Quizzes
In this paper we formalize the problem of automatic fill-in-the-blank quest...

09/10/2023 · Retrieval-Augmented Meta Learning for Low-Resource Text Classification
Meta learning has achieved promising performance in low-resource text c...

03/07/2022 · Language-Agnostic Meta-Learning for Low-Resource Text-to-Speech with Articulatory Features
While neural text-to-speech systems perform remarkably well in high-reso...
