Meta-Learning for Natural Language Understanding under Continual Learning Framework

11/03/2020
by Jiacheng Wang, et al.

Neural networks have been widely recognized for their accomplishments in tackling various natural language understanding (NLU) tasks. Methods have been developed to train robust models that handle multiple tasks and thereby learn a general representation of text. In this paper, we implement the model-agnostic meta-learning (MAML) and Online-aware Meta-learning (OML) meta-objectives under a continual learning framework for NLU tasks. We validate our methods on selected SuperGLUE and GLUE benchmarks.
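The abstract names MAML's bi-level meta-objective; to make the idea concrete, here is a minimal first-order sketch of the inner/outer loop in PyTorch. The names `sample_tasks`, `support`, and `query` are hypothetical placeholders, not the authors' code:

```python
# Minimal first-order MAML sketch (a simplification for clarity, not the
# paper's training code). Assumes a PyTorch `model`, a hypothetical
# `sample_tasks(n)` yielding (support, query) batches, and a `loss_fn`
# such as cross-entropy.
import torch

def maml_step(model, loss_fn, sample_tasks, meta_opt,
              inner_lr=1e-2, n_tasks=4):
    meta_opt.zero_grad()
    for support, query in sample_tasks(n_tasks):
        # Inner loop: one gradient step on the task's support set,
        # applied functionally so the original weights stay untouched.
        params = dict(model.named_parameters())
        s_x, s_y = support
        s_loss = loss_fn(torch.func.functional_call(model, params, (s_x,)), s_y)
        grads = torch.autograd.grad(s_loss, list(params.values()),
                                    create_graph=False)  # first-order variant
        adapted = {name: p - inner_lr * g
                   for (name, p), g in zip(params.items(), grads)}
        # Outer loop: evaluate the adapted parameters on the query set;
        # backward() accumulates the meta-gradient into model.parameters().
        q_x, q_y = query
        q_loss = loss_fn(torch.func.functional_call(model, adapted, (q_x,)), q_y)
        q_loss.backward()
    meta_opt.step()
```

OML keeps the same outer objective but, following Javed and White (2019), draws the inner-loop examples as an online, correlated stream so the learned representation is explicitly penalized for catastrophic forgetting.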


Related research:

08/27/2019 · Investigating Meta-Learning Algorithms for Low-Resource Natural Language Understanding Tasks
Learning general representations of text is a fundamental problem for ma...

06/27/2022 · Leveraging Language for Accelerated Learning of Tool Manipulation
Robust and generalized tool manipulation requires an understanding of th...

03/06/2023 · Model-Agnostic Meta-Learning for Natural Language Understanding Tasks in Finance
Natural language understanding (NLU) is challenging for finance due to th...

01/27/2023 · Meta Temporal Point Processes
A temporal point process (TPP) is a stochastic process where its realiza...

06/03/2018 · On the Importance of Attention in Meta-Learning for Few-Shot Text Classification
Current deep learning based text classification methods are limited by t...

05/03/2022 · Meta Learning for Natural Language Processing: A Survey
Deep learning has been the mainstream technique in natural language proc...

06/06/2021 · Meta-learning for downstream aware and agnostic pretraining
Neural network pretraining is gaining attention due to its outstanding p...
