Linguistic representations for fewer-shot relation extraction across domains

07/07/2023
by Sireesh Gururaja, et al.

Recent work has demonstrated the positive impact of incorporating linguistic representations as additional context and scaffolding on the in-domain performance of several NLP tasks. We extend this work by exploring the impact of linguistic representations on cross-domain performance in a few-shot transfer setting. An important question is whether linguistic representations enhance generalizability by providing features that function as cross-domain pivots. We focus on the task of relation extraction on three datasets of procedural text in two domains, cooking and materials science. Our approach augments a popular transformer-based architecture by alternately incorporating syntactic and semantic graphs constructed by freely available off-the-shelf tools. We examine their utility for enhancing generalization, and investigate whether earlier findings, e.g., that semantic representations can be more helpful than syntactic ones, extend to relation extraction in multiple domains. We find that while the inclusion of these graphs results in significantly higher performance in few-shot transfer, both types of graphs exhibit roughly equivalent utility.
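The abstract only sketches the recipe at a high level: an off-the-shelf parser produces a graph over the tokens, and a graph layer propagates information along that graph alongside the contextual token representations. The following minimal Python sketch illustrates that general idea; it is not the authors' implementation. The names (dependency_adjacency, GraphAugmentedEncoder), the GRU standing in for the pretrained transformer, and the single graph-convolution step are illustrative assumptions made here for brevity.

import spacy
import torch
import torch.nn as nn

nlp = spacy.load("en_core_web_sm")  # off-the-shelf dependency parser (assumed installed)

def dependency_adjacency(sentence):
    # Symmetric token-level adjacency matrix from the dependency parse, with self-loops.
    doc = nlp(sentence)
    adj = torch.eye(len(doc))
    for tok in doc:
        if tok.i != tok.head.i:          # the root token heads itself; skip that arc
            adj[tok.i, tok.head.i] = 1.0
            adj[tok.head.i, tok.i] = 1.0
    return [tok.text for tok in doc], adj

class GraphAugmentedEncoder(nn.Module):
    # Toy encoder: contextual token states (a GRU standing in for a transformer)
    # fused with one graph-convolution step over the parser-derived adjacency.
    def __init__(self, vocab_size, dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.context = nn.GRU(dim, dim, batch_first=True)
        self.graph_proj = nn.Linear(dim, dim)

    def forward(self, token_ids, adj):
        h, _ = self.context(self.embed(token_ids))       # (batch, n, dim) contextual states
        norm = adj / adj.sum(dim=-1, keepdim=True)       # row-normalized adjacency
        graph_h = torch.relu(self.graph_proj(norm @ h))  # propagate along syntactic edges
        return h + graph_h                               # fuse sequential and graph views

In the paper itself, the contextual encoder is a pretrained transformer, and a semantic graph produced by another off-the-shelf tool can be incorporated in the same way as the syntactic one; those details, along with how relations are classified from the fused representations, are in the full text rather than this abstract.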


