A Simple and Unified Tagging Model with Priming for Relational Structure Predictions

05/25/2022
by   I-Hung Hsu, et al.

Relational structure extraction covers a wide range of tasks and plays an important role in natural language processing. Recently, many approaches have designed sophisticated graphical models to capture the complex relations between objects described in a sentence. In this work, we demonstrate that simple tagging models can achieve surprisingly competitive performance with a small trick: priming. Tagging models with priming append information about the operated objects to the input sequence of a pretrained language model. By exploiting the contextualized nature of the pretrained language model, priming helps the contextualized representation of the sentence better encode information about the operated objects, making it more suitable for relational structure extraction. We conduct extensive experiments on three tasks spanning ten datasets across five languages, and show that our model is general and effective despite its simplicity. We further carry out a comprehensive analysis to understand our model, and propose an efficient approximation to our method that achieves almost the same performance with faster inference.
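The priming idea above can be sketched in a few lines. This is a minimal illustration based only on the abstract's high-level description (appending the operated object to the encoder input); the function name, separator token, and span convention are assumptions, not the paper's exact implementation.

```python
# Hypothetical sketch of "priming" a tagging model's input.
# Assumption: the "operated object" is a token span of the sentence, and
# priming means appending its tokens (after a separator) so that the
# pretrained encoder's contextualized representations are conditioned on it.

def build_primed_input(tokens, object_span, sep="[SEP]"):
    """Return the primed token sequence fed to the pretrained encoder.

    tokens: list[str] -- the sentence, already tokenized
    object_span: (start, end) -- half-open indices of the operated object
    The tagger then predicts one label per original sentence token
    (e.g., each token's relation to the primed object).
    """
    start, end = object_span
    return tokens + [sep] + tokens[start:end]

sentence = ["Alice", "works", "for", "Acme", "Corp", "."]
primed = build_primed_input(sentence, (0, 1))  # prime with the span "Alice"
# -> ["Alice", "works", "for", "Acme", "Corp", ".", "[SEP]", "Alice"]
```

Because the primed object is part of the input, the same plain tagging architecture can be reused for different relational tasks by varying only what is appended.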


