Improving Relation Extraction with Knowledge-attention

10/07/2019
by Kezhi Mao, et al.

While attention mechanisms have proven effective in many NLP tasks, the majority of them are purely data-driven. We propose a novel knowledge-attention encoder that incorporates prior knowledge from external lexical resources into deep neural networks for the relation extraction task. Furthermore, we present three effective ways of integrating knowledge-attention with self-attention to maximize the utilization of both knowledge and data. The proposed relation extraction system is end-to-end and fully attention-based. Experimental results show that the proposed knowledge-attention mechanism has strengths complementary to self-attention, and our integrated models outperform existing CNN-, RNN-, and self-attention-based models. State-of-the-art performance is achieved on TACRED, a complex and large-scale relation extraction dataset.
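As a rough illustration of the core idea, here is a minimal sketch (in PyTorch) of what a knowledge-attention layer could look like, assuming queries come from the input tokens while keys and values are derived from a fixed matrix of embeddings drawn from an external lexical resource. The class name, the single-head formulation, and all sizes are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KnowledgeAttention(nn.Module):
    """Attention over external knowledge embeddings (illustrative sketch)."""

    def __init__(self, d_model: int, knowledge_emb: torch.Tensor):
        super().__init__()
        # knowledge_emb: (n_entries, d_know) embeddings of, e.g.,
        # relation-indicative words from a lexical resource (hypothetical).
        self.register_buffer("knowledge", knowledge_emb)
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(knowledge_emb.size(-1), d_model)
        self.w_v = nn.Linear(knowledge_emb.size(-1), d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) token representations.
        q = self.w_q(x)                          # (B, T, d_model)
        k = self.w_k(self.knowledge)             # (N, d_model)
        v = self.w_v(self.knowledge)             # (N, d_model)
        scores = q @ k.t() / k.size(-1) ** 0.5   # (B, T, N)
        return F.softmax(scores, dim=-1) @ v     # (B, T, d_model)

# Usage with made-up sizes: 500 knowledge entries of dimension 300.
know = torch.randn(500, 300)
layer = KnowledgeAttention(d_model=256, knowledge_emb=know)
out = layer(torch.randn(2, 16, 256))  # -> (2, 16, 256)
```

One plausible way to combine such a layer with standard self-attention, in the spirit of the integration strategies the abstract mentions, is to concatenate or sum the two attention outputs before the classification layer; the paper's three specific integration methods are not detailed in this abstract.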

Related research

02/20/2021  Entity Structure Within and Throughout: Modeling Mention Dependencies for Document-Level Relation Extraction
07/09/2018  Position-aware Self-attention with Relative Positional Encodings for Slot Filling
06/09/2019  Attention-based Conditioning Methods for External Knowledge Integration
09/03/2018  Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction
11/02/2022  Cross-stitching Text and Knowledge Graph Encoders for Distantly Supervised Relation Extraction
10/18/2021  Finding Strong Gravitational Lenses Through Self-Attention
04/28/2018  Data-Driven Methods for Solving Algebra Word Problems
