Contrastive Triple Extraction with Generative Transformer

09/14/2020
by   Hongbin Ye, et al.

Triple extraction is an essential task in information extraction for natural language processing and knowledge graph construction. In this paper, we revisit end-to-end triple extraction as a sequence-generation task. Since generative triple extraction may struggle to capture long-term dependencies and can generate unfaithful triples, we introduce a novel model: contrastive triple extraction with a generative transformer. Specifically, we use a single shared transformer module for encoder-decoder-based generation. To generate faithful results, we propose a novel triplet contrastive training objective. Moreover, we introduce two mechanisms to further improve model performance: batch-wise dynamic attention masking and triple-wise calibration. Experimental results on three datasets (NYT, WebNLG, and MIE) show that our approach outperforms the baselines. Our code and datasets will be released after publication.
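The abstract does not give the exact form of the triplet contrastive training objective, but one common way to realize such an objective is a margin-based contrast between the model's score for a faithful (gold) triple sequence and its score for a corrupted one. The sketch below is illustrative only: the function names, the margin value, and the use of summed token log-probabilities as the sequence score are assumptions, not details from the paper.

```python
import math

def sequence_logprob(token_probs):
    """Score of a generated triple sequence: sum of per-token log-probabilities,
    as would be produced by an autoregressive decoder."""
    return sum(math.log(p) for p in token_probs)

def triplet_contrastive_loss(pos_probs, neg_probs, margin=1.0):
    """Illustrative margin-based contrastive objective (an assumption, not the
    paper's exact formulation): push the score of the faithful triple sequence
    above the score of a corrupted triple sequence by at least `margin`."""
    s_pos = sequence_logprob(pos_probs)  # faithful triple
    s_neg = sequence_logprob(neg_probs)  # corrupted triple
    return max(0.0, margin - (s_pos - s_neg))

# A confidently decoded faithful triple vs. a low-probability corrupted one:
# the score gap exceeds the margin, so the loss is zero.
easy = triplet_contrastive_loss([0.9, 0.8, 0.9], [0.3, 0.2, 0.4])

# When the two sequences score similarly, the loss is positive and the
# gradient would push the model to separate them.
hard = triplet_contrastive_loss([0.5, 0.5], [0.4, 0.5])
```

In practice the corrupted triples would be constructed by perturbing entities or relations in the gold triples, and this loss would be added to the standard generation (cross-entropy) objective.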
