MrGCN: Mirror Graph Convolution Network for Relation Extraction with Long-Term Dependencies

01/01/2021
by Xiao Guo, et al.

The ability to capture complex linguistic structures and long-term dependencies among words in a passage is essential for many natural language understanding tasks. In relation extraction, dependency trees that contain rich syntactic clues have been widely used to help capture long-term dependencies in text. Graph neural networks (GNNs), one means of encoding dependency graphs, have been shown effective in several prior works. However, relatively little attention has been paid to the receptive fields of GNNs, which can be crucial for tasks with extremely long text that goes beyond single sentences and requires discourse analysis. In this work, we leverage the idea of graph pooling and propose the Mirror Graph Convolution Network (MrGCN), a GNN model with a pooling-unpooling structure tailored to relation extraction. The pooling branch reduces the graph size and enables the GCN to obtain larger receptive fields with fewer layers; the unpooling branch restores the pooled graph to its original resolution so that token-level relation extraction can be performed. Experiments on two datasets demonstrate the effectiveness of our method, showing significant improvements over previous results.
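As a rough illustration of the pooling-unpooling idea described above — not the paper's actual implementation, whose pooling operator and skip-connection scheme are not specified in this abstract — the following minimal PyTorch sketch coarsens the graph with a hypothetical soft cluster-assignment matrix `assign`, applies a graph convolution on the smaller graph, then unpools back to token-level resolution. All class and variable names here are illustrative assumptions:

```python
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(A_hat @ H @ W)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, adj: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # adj: (n, n) normalized adjacency with self-loops; h: (n, in_dim)
        return torch.relu(adj @ self.linear(h))


class MirrorGCNSketch(nn.Module):
    """Hypothetical pooling-unpooling ("mirror") GCN sketch.

    `assign` is an assumed (n, m) soft cluster-assignment matrix with
    m < n; the real MrGCN pooling operator may differ.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.down = GCNLayer(dim, dim)  # convolution at token resolution
        self.mid = GCNLayer(dim, dim)   # convolution on the pooled graph
        self.up = GCNLayer(dim, dim)    # convolution after unpooling

    def forward(self, adj, h, assign):
        h1 = self.down(adj, h)
        # Pool: coarsen features and adjacency from n to m nodes.
        h_pool = assign.t() @ h1
        adj_pool = assign.t() @ adj @ assign
        h2 = self.mid(adj_pool, h_pool)
        # Unpool: broadcast coarse features back to the n original tokens,
        # with a residual connection so token-level detail is preserved.
        h_up = assign @ h2 + h1
        return self.up(adj, h_up)


if __name__ == "__main__":
    n, m, d = 6, 3, 16                     # toy sizes
    adj = torch.eye(n)                     # placeholder normalized adjacency
    assign = torch.rand(n, m).softmax(-1)  # hypothetical soft assignment
    out = MirrorGCNSketch(d)(adj, torch.randn(n, d), assign)
    print(out.shape)                       # torch.Size([6, 16])
```

With n tokens pooled into m < n clusters, each convolution on the coarsened graph mixes information across whole clusters of tokens, so the effective receptive field grows much faster than it would by stacking convolutions at the original resolution — which is the motivation the abstract gives for the pooling branch.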
