
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction

by Yuhao Zhang et al.

Dependency trees help relation extraction models capture long-range relations between words. However, existing dependency-based models either neglect crucial information (e.g., negation) by pruning the dependency trees too aggressively, or are computationally inefficient because it is difficult to parallelize over different tree structures. We propose an extension of graph convolutional networks that is tailored for relation extraction, which pools information over arbitrary dependency structures efficiently in parallel. To incorporate relevant information while maximally removing irrelevant content, we further apply a novel pruning strategy to the input trees by keeping words immediately around the shortest path between the two entities among which a relation might hold. The resulting model achieves state-of-the-art performance on the large-scale TACRED dataset, outperforming existing sequence and dependency-based neural models. We also show through detailed analysis that this model has complementary strengths to sequence models, and combining them further improves the state of the art.
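To make the two ideas in the abstract concrete, here is a minimal sketch of (a) the path-centric pruning strategy, keeping only tokens within distance K of the shortest dependency path between the two entities, and (b) a single graph-convolution layer over the resulting tree's adjacency matrix. This is an illustrative toy, not the authors' implementation: the token indices, the 6-token example tree, and the degree-normalized ReLU layer are assumptions for demonstration.

```python
import numpy as np
from collections import deque

def shortest_path(neighbors, src, dst):
    """BFS shortest path between two tokens in a dependency tree."""
    parent = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            break
        for v in neighbors[u]:
            if v not in parent:
                parent[v] = u
                q.append(v)
    path, u = [], dst
    while u is not None:
        path.append(u)
        u = parent[u]
    return path[::-1]

def prune(neighbors, subj, obj, k):
    """Keep tokens within distance k of the subj-obj shortest path."""
    keep = set(shortest_path(neighbors, subj, obj))
    frontier = set(keep)
    for _ in range(k):
        frontier = {v for u in frontier for v in neighbors[u]} - keep
        keep |= frontier
    return keep

def gcn_layer(adj, h, w):
    """One graph-convolution layer: self-loops, degree norm, ReLU."""
    a = adj + np.eye(adj.shape[0])
    d = a.sum(axis=1, keepdims=True)
    return np.maximum(a @ h @ w / d, 0.0)

# Toy dependency tree over 6 tokens (edges treated as undirected).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (2, 5)]
n = 6
neighbors = {i: [] for i in range(n)}
adj = np.zeros((n, n))
for i, j in edges:
    neighbors[i].append(j)
    neighbors[j].append(i)
    adj[i, j] = adj[j, i] = 1.0

# K = 0 keeps exactly the shortest path between the two entities;
# token 5 (off-path) is pruned away.
kept = prune(neighbors, subj=0, obj=4, k=0)
print(sorted(kept))  # [0, 1, 2, 3, 4]

# One GCN layer pooling information over the (unpruned) tree.
rng = np.random.default_rng(0)
h = rng.normal(size=(n, 4))   # token representations
w = rng.normal(size=(4, 4))   # layer weights
out = gcn_layer(adj, h, w)
print(out.shape)  # (6, 4)
```

With K = 1 the off-path modifier (token 5) is retained as well, which is how the pruning strategy preserves crucial nearby words such as negation that strict shortest-path pruning would discard.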



