Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction

09/03/2018
by   Jinhua Du, et al.

Attention mechanisms are often used in deep neural networks for distantly supervised relation extraction (DS-RE) to distinguish valid from noisy instances. However, traditional 1-D vector attention models are insufficient for learning the different contexts needed to select valid instances when predicting the relation for an entity pair. To alleviate this issue, we propose a novel multi-level structured (2-D matrix) self-attention mechanism for DS-RE in a multi-instance learning (MIL) framework using bidirectional recurrent neural networks. In the proposed method, a structured word-level self-attention mechanism learns a 2-D matrix in which each row vector represents a weight distribution over a different aspect of an instance with respect to the two entities. Targeting the MIL issue, the structured sentence-level attention learns a 2-D matrix in which each row vector represents a weight distribution over the selection of different valid instances. Experiments conducted on two publicly available DS-RE datasets show that the proposed framework with a multi-level structured self-attention mechanism significantly outperforms state-of-the-art baselines in terms of PR curves, P@N and F1 measures.
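To make the "2-D matrix" idea concrete, here is a minimal NumPy sketch of structured self-attention in the style the abstract describes (following the A = softmax(W2 tanh(W1 Hᵀ)) formulation of structured self-attentive embeddings). The dimension names (n, d, da, r), the random inputs, and the function name are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structured_self_attention(H, W1, W2):
    """H: (n, d) hidden states for n words (or n instances in a bag, at
    the sentence level). Returns A: (r, n), where each of the r rows is a
    weight distribution over the n positions, and M = A @ H: (r, d), the
    multi-view summary used in place of a single attention-weighted vector."""
    A = softmax(W2 @ np.tanh(W1 @ H.T), axis=-1)  # (r, n) attention matrix
    M = A @ H                                     # (r, d) structured summary
    return A, M

rng = np.random.default_rng(0)
n, d, da, r = 5, 8, 6, 3      # positions, hidden dim, attention dim, rows
H = rng.standard_normal((n, d))
W1 = rng.standard_normal((da, d))   # illustrative learned parameters
W2 = rng.standard_normal((r, da))
A, M = structured_self_attention(H, W1, W2)
print(A.shape, M.shape)       # (3, 5) (3, 8)
print(A.sum(axis=1))          # each row sums to 1
```

The same operation can be applied at both levels: over word hidden states within an instance, and over instance representations within a bag, with each row of A attending to a different aspect rather than collapsing everything into one 1-D weight vector.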
