MrGCN: Mirror Graph Convolution Network for Relation Extraction with Long-Term Dependencies

by Xiao Guo, et al.

The ability to capture complex linguistic structures and long-term dependencies among words in a passage is essential for many natural language understanding tasks. In relation extraction, dependency trees, which contain rich syntactic clues, have been widely used to help capture long-term dependencies in text. Graph neural networks (GNNs), one means of encoding dependency graphs, have been shown effective in several prior works. However, relatively little attention has been paid to the receptive fields of GNNs, which can be crucial in tasks with extremely long text that goes beyond single sentences and requires discourse analysis. In this work, we leverage the idea of graph pooling and propose the Mirror Graph Convolution Network (MrGCN), a GNN model with pooling-unpooling structures tailored to relation extraction. The pooling branch reduces the graph size and enables the GCN to obtain larger receptive fields with fewer layers; the unpooling branch restores the pooled graph to its original resolution so that token-level relation extraction can be performed. Experiments on two datasets demonstrate the effectiveness of our method, showing significant improvements over previous results.
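The pooling-unpooling idea described above can be illustrated with a minimal numpy sketch. This is not the paper's implementation; it assumes a hard cluster-assignment matrix `S` (any graph-coarsening scheme could supply one), a standard symmetrically normalized GCN layer, and unpooling by broadcasting pooled features back through `S`:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One GCN layer: ReLU(D^-1/2 (A + I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)

def pool(A, X, S):
    """Coarsen graph: S is a hard assignment matrix (n x m, m < n).
    Each column of S selects the original nodes merged into one supernode."""
    return S.T @ A @ S, S.T @ X             # pooled adjacency and features

def unpool(X_pooled, S):
    """Restore original resolution by copying each supernode's
    features back to its member nodes."""
    return S @ X_pooled

# Toy example: a 4-node chain graph, merging node pairs {0,1} and {2,3}.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.ones((4, 3))                          # 4 tokens, 3-dim features
S = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)

H = gcn_layer(A, X, np.ones((3, 2)))         # token-level convolution
A_p, H_p = pool(A, H, S)                     # coarsened graph (2 supernodes)
H_p = gcn_layer(A_p, H_p, np.ones((2, 2)))   # convolution with a wider receptive field
H_out = unpool(H_p, S)                       # back to 4 token positions
print(H_out.shape)
```

On the coarse graph, one GCN layer mixes information across supernodes that are several hops apart in the original graph, which is why pooling enlarges the receptive field per layer; the final unpool recovers per-token representations for classification.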



