Contextualized Non-local Neural Networks for Sequence Learning

by Pengfei Liu, et al.
HEC Montréal
McGill University
The Ohio State University
Fudan University

Recently, a large number of neural mechanisms and models have been proposed for sequence learning, of which self-attention, as exemplified by the Transformer model, and graph neural networks (GNNs) have attracted much attention. In this paper, we propose an approach that combines and draws on the complementary strengths of these two methods. Specifically, we propose contextualized non-local neural networks (CN^3), which can both dynamically construct a task-specific structure of a sentence and leverage rich local dependencies within a particular neighborhood. Experimental results on ten NLP tasks in text classification, semantic matching, and sequence labeling show that our proposed model outperforms competitive baselines and discovers task-specific dependency structures, thus providing better interpretability to users.
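To make the combination concrete, here is a minimal sketch of a layer in the spirit of a contextualized non-local block: dot-product self-attention dynamically weights all token pairs (a task-specific, fully connected graph), while a windowed average captures local dependencies within a neighborhood. The function name, the simple additive fusion, and the fixed window are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def cn3_block(x, window=1):
    """Hypothetical sketch of a contextualized non-local block.

    x: (n, d) array of n token embeddings of dimension d.
    The attention matrix plays the role of a dynamically constructed
    sentence graph; the windowed term models local dependencies.
    The additive fusion below is an assumption for illustration.
    """
    n, d = x.shape

    # Non-local term: scaled dot-product self-attention over all positions.
    scores = x @ x.T / np.sqrt(d)
    attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
    attn /= attn.sum(axis=-1, keepdims=True)
    non_local = attn @ x

    # Local term: mean over a +/- `window` neighborhood around each token.
    local = np.zeros_like(x)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        local[i] = x[lo:hi].mean(axis=0)

    # Combine the two views (the actual model learns this fusion).
    return non_local + local
```

With identical input rows, attention is uniform and both terms reduce to the shared row, so the output is simply twice the input; with varied inputs, the attention term mixes information from all positions while the local term stays confined to the window.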




