Hybrid Attention-Based Transformer Block Model for Distant Supervision Relation Extraction

03/10/2020
by Yan Xiao, et al.

With the explosive growth of digital text, it is challenging to efficiently obtain specific knowledge from massive amounts of unstructured text. As a basic task in natural language processing (NLP), relation extraction aims to identify the semantic relation between an entity pair in a given text. To avoid manual labeling of datasets, distant supervision relation extraction (DSRE) has been widely used: a knowledge base is employed to annotate datasets automatically. Unfortunately, this method suffers heavily from wrong labels caused by its underlying strong assumptions. To address this issue, we propose a new framework that combines a hybrid attention-based Transformer block with multi-instance learning to perform the DSRE task. More specifically, the Transformer block is first used as the sentence encoder to capture syntactic information, relying mainly on multi-head self-attention to extract features at the word level. Then, a more concise sentence-level attention mechanism is adopted to build the bag representation, incorporating the valid information of each sentence so that the bag is represented effectively. Experimental results on the public New York Times (NYT) dataset demonstrate that the proposed approach outperforms state-of-the-art algorithms on the evaluation set, which verifies the effectiveness of our model for the DSRE task.
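The two attention mechanisms named in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the single-layer setup, and the use of a relation query vector for sentence-level (selective) attention are illustrative assumptions; only the general mechanics (scaled dot-product multi-head self-attention over words, softmax-weighted pooling of sentence encodings into a bag vector) follow the abstract.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, n_heads):
    """Word-level multi-head self-attention, as inside a Transformer block.
    X: (seq_len, d_model) word embeddings; Wq/Wk/Wv: (d_model, d_model)."""
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    # project, then split the model dimension into heads
    Q = (X @ Wq).reshape(seq_len, n_heads, d_head)
    K = (X @ Wk).reshape(seq_len, n_heads, d_head)
    V = (X @ Wv).reshape(seq_len, n_heads, d_head)
    out = np.empty_like(Q)
    for h in range(n_heads):
        scores = Q[:, h] @ K[:, h].T / np.sqrt(d_head)  # scaled dot-product
        out[:, h] = softmax(scores, axis=-1) @ V[:, h]
    return out.reshape(seq_len, d_model)

def bag_representation(sentence_vecs, relation_query):
    """Sentence-level selective attention: score each sentence encoding in
    the bag against a (hypothetical) relation query vector, softmax the
    scores, and return the weighted sum as the bag representation."""
    weights = softmax(sentence_vecs @ relation_query)   # (n_sentences,)
    return weights @ sentence_vecs, weights             # (d,), (n_sentences,)
```

Under multi-instance learning, each bag vector produced this way is what the classifier scores against the relation labels, so noisy sentences receive low attention weights instead of corrupting the bag.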


