Curriculum-Meta Learning for Order-Robust Continual Relation Extraction

01/06/2021
by Tongtong Wu, et al.

Continual relation extraction is an important task that focuses on extracting new facts incrementally from unstructured text. Because relations arrive sequentially, this task is prone to two serious challenges, namely catastrophic forgetting and order-sensitivity. We propose a novel curriculum-meta learning method to tackle these two challenges in continual relation extraction. We combine meta learning and curriculum learning to quickly adapt model parameters to a new task and to reduce the interference of previously seen tasks on the current task. We design a novel relation representation learning method based on the distribution of the domain and range types of relations. These representations are used to quantify task difficulty for the construction of curricula. Moreover, we present novel difficulty-based metrics that quantitatively measure the extent of a model's order-sensitivity, suggesting new ways to evaluate model robustness. Comprehensive experiments on three benchmark datasets show that our proposed method outperforms state-of-the-art techniques. The code is available at https://github.com/wutong8023/AAAI_CML.
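
The abstract only outlines the method, so the snippet below is a minimal sketch rather than the authors' implementation. It assumes each relation is summarized by an empirical distribution over the entity types of its domain (head) and range (tail) arguments, scores the closeness of two relations with a symmetric KL divergence, and orders previously seen relations into a replay curriculum for the new one. All names (type_distribution, build_curriculum, etc.) are illustrative.

```python
import numpy as np

def type_distribution(triples, type_of, num_types):
    """Empirical distribution over (domain, range) entity types for one relation.

    triples: list of (head_entity, tail_entity) pairs observed for the relation.
    type_of: dict mapping an entity to its type id in [0, num_types).
    """
    counts = np.full(2 * num_types, 1e-6)          # smoothed counts: domain types, then range types
    for head, tail in triples:
        counts[type_of[head]] += 1.0               # domain (head) type
        counts[num_types + type_of[tail]] += 1.0   # range (tail) type
    return counts / counts.sum()

def symmetric_kl(p, q):
    """Symmetrized KL divergence between two smoothed categorical distributions."""
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def build_curriculum(new_rel, seen_rels, rel_dist):
    """Order previously seen relations from most to least similar to the new one.

    Relations whose type distributions are close to the new relation are assumed
    to interfere most, so they are replayed first (a hypothetical curriculum rule,
    not necessarily the paper's exact ordering).
    """
    scores = {r: symmetric_kl(rel_dist[new_rel], rel_dist[r]) for r in seen_rels}
    return sorted(seen_rels, key=lambda r: scores[r])
```

In the full method, such a curriculum is coupled with a meta-learning objective (e.g., a MAML- or Reptile-style update) so that the model adapts quickly to the new relation while replaying the old relations most likely to interfere with it.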

Related research

- 10/10/2022: Improving Continual Relation Extraction through Prototypical Contrastive Learning
- 03/05/2022: Consistent Representation Learning for Continual Relation Extraction
- 05/08/2023: Enhancing Continual Relation Extraction via Classifier Decomposition
- 09/12/2021: Exploring Task Difficulty for Few-Shot Relation Extraction
- 09/10/2020: Meta-Learning with Sparse Experience Replay for Lifelong Language Learning
- 10/10/2022: Learning Robust Representations for Continual Relation Extraction via Adversarial Class Augmentation
- 05/17/2022: Generic and Trend-aware Curriculum Learning for Relation Extraction in Graph Neural Networks
