
Contrastive Triple Extraction with Generative Transformer

09/14/2020
by   Hongbin Ye, et al.

Triple extraction is an essential task in information extraction for natural language processing and knowledge graph construction. In this paper, we revisit end-to-end triple extraction as a sequence generation task. Since generative triple extraction may struggle to capture long-term dependencies and can generate unfaithful triples, we introduce a novel model: contrastive triple extraction with a generative transformer. Specifically, we use a single shared transformer module for encoder-decoder-based generation. To generate faithful results, we propose a novel triplet contrastive training objective. Moreover, we introduce two mechanisms to further improve model performance: batch-wise dynamic attention-masking and triple-wise calibration. Experimental results on three datasets (NYT, WebNLG, and MIE) show that our approach outperforms the baselines. Our code and datasets will be released after publication.
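The abstract does not spell out the triplet contrastive objective, so the following is only a minimal sketch of one plausible form: the shared transformer pools a (text, triple) pair into a single vector, and a small head is trained to separate gold triples from corrupted negatives, with the resulting loss added to the usual generation loss. All names here (TripletContrastiveHead, pooled_states, the BCE formulation) are assumptions for illustration, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class TripletContrastiveHead(nn.Module):
    """Hypothetical triplet contrastive objective (sketch, not the paper's code).

    Assumes the shared transformer produces a pooled summary vector for each
    [text; triple] pair, and that negatives are corrupted copies of gold triples.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        # Map the pooled state to a single "faithfulness" logit.
        self.scorer = nn.Linear(hidden_size, 1)
        self.loss_fn = nn.BCEWithLogitsLoss()

    def forward(self, pooled_states: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # pooled_states: (batch, hidden) pooled encoding of each (text, triple) pair
        # labels: (batch,) 1.0 for gold triples, 0.0 for corrupted negatives
        logits = self.scorer(pooled_states).squeeze(-1)
        return self.loss_fn(logits, labels)

# Assumed usage: the contrastive term is added to the generation cross-entropy.
# head = TripletContrastiveHead(hidden_size=768)
# loss = generation_loss + head(pooled_states, labels)
```

Under this reading, the contrastive term penalizes the model for assigning high faithfulness scores to corrupted triples, which is one way a generative extractor could be discouraged from emitting triples unsupported by the input text.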

