Contextualized Non-local Neural Networks for Sequence Learning

11/21/2018
by Pengfei Liu et al.

Recently, many neural mechanisms and models have been proposed for sequence learning; among them, self-attention, as exemplified by the Transformer model, and graph neural networks (GNNs) have attracted particular attention. In this paper, we propose an approach that draws on the complementary strengths of these two methods. Specifically, we propose contextualized non-local neural networks (CN^3), which can both dynamically construct a task-specific structure over a sentence and leverage rich local dependencies within a particular neighborhood. Experimental results on ten NLP tasks spanning text classification, semantic matching, and sequence labeling show that our proposed model outperforms competitive baselines and discovers task-specific dependency structures, thereby offering better interpretability to users.
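The core idea, pairing a non-local (self-attention) pass that forms a soft, task-specific graph over all tokens with a local neighborhood aggregation, can be sketched minimally as follows. This is a hypothetical illustration in NumPy, not the paper's actual architecture: the function name `cn3_block`, the additive fusion of the two branches, and the fixed-radius mean for the local branch are all assumptions made for clarity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cn3_block(x, radius=1):
    """Hypothetical sketch of a contextualized non-local block.

    x: (seq_len, dim) array of token representations.
    Combines a non-local branch (dot-product self-attention,
    whose weights act as a soft sentence graph) with a local
    branch (mean over a +/- radius window of neighbors).
    """
    n, d = x.shape
    # Non-local branch: attention weights over all token pairs
    attn = softmax(x @ x.T / np.sqrt(d), axis=-1)  # (n, n), rows sum to 1
    non_local = attn @ x
    # Local branch: average over a fixed neighborhood window
    local = np.stack([
        x[max(0, i - radius): i + radius + 1].mean(axis=0)
        for i in range(n)
    ])
    # Simple additive fusion (the real model's combination may differ)
    return non_local + local
```

In a trained model, the attention matrix would be inspected per task to recover the "task-specific dependency structures" the abstract refers to; here it is just an unparameterized similarity for illustration.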


Related research

05/29/2020  Non-Local Graph Neural Networks
Modern graph neural networks (GNNs) learn node embeddings through multil...

02/24/2023  Spatial Bias for Attention-free Non-local Neural Networks
In this paper, we introduce the spatial bias to learn global knowledge w...

10/05/2021  TENT: Text Classification Based on ENcoding Tree Learning
Text classification is a primary task in natural language processing (NL...

02/18/2020  Conditional Self-Attention for Query-based Summarization
Self-attention mechanisms have achieved great success on a variety of NL...

04/27/2021  Node Embedding using Mutual Information and Self-Supervision based Bi-level Aggregation
Graph Neural Networks (GNNs) learn low dimensional representations of no...

05/24/2019  SCRAM: Spatially Coherent Randomized Attention Maps
Attention mechanisms and non-local mean operations in general are key in...

03/15/2020  Self-Constructing Graph Convolutional Networks for Semantic Labeling
Graph Neural Networks (GNNs) have received increasing attention in many ...
