Attentive Convolution

10/02/2017
by Wenpeng Yin, et al.

In NLP, convolutional neural networks (CNNs) have benefited less than recurrent neural networks (RNNs) from attention mechanisms. We hypothesize that this is because attention in CNNs has been mainly implemented as attentive pooling (i.e., it is applied to pooling) rather than as attentive convolution (i.e., it is integrated into convolution). Convolution is the differentiator of CNNs in that it can powerfully model the higher-level representation of a word by taking into account its local fixed-size context in the input text t^x. In this work, we propose an attentive convolution network, AttentiveConvNet. It extends the context scope of the convolution operation, deriving higher-level features for a word not only from local context, but also from information extracted from nonlocal context by the attention mechanism commonly used in RNNs. This nonlocal context can come (i) from parts of the input text t^x that are distant or (ii) from a second input text, the context text t^y. In evaluations on sentence relation classification (textual entailment and answer sentence selection) and text classification, AttentiveConvNet achieves state-of-the-art performance and outperforms RNN/CNN variants with and without attention.
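The core idea can be sketched in a few lines. The following is a minimal, hypothetical illustration, not the paper's implementation: for each word in t^x, a width-3 convolution over the local window is augmented with an attention-weighted summary of the context text t^y, which is passed through an extra filter and added before the nonlinearity. All names, shapes, and the dot-product scoring function are our own simplifying assumptions.

```python
import numpy as np

# Toy setup: random embeddings stand in for learned word vectors.
rng = np.random.default_rng(0)
d = 4                          # word-embedding dimension (assumed)
X = rng.normal(size=(5, d))    # input text t^x: 5 words
Y = rng.normal(size=(7, d))    # context text t^y: 7 words

# Convolution filters for the left/center/right window positions, plus an
# extra filter W_attn for the attended context (shapes are our assumption;
# in the paper all parameters are learned jointly).
W_left, W_self, W_right, W_attn = (rng.normal(size=(d, d)) for _ in range(4))
b = np.zeros(d)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attentive_conv(X, Y):
    """Width-3 convolution over X whose per-position input is extended
    with an attention-derived summary of Y (a simplified sketch)."""
    Xp = np.vstack([np.zeros(d), X, np.zeros(d)])  # zero-pad the borders
    H = []
    for i in range(len(X)):
        # Attention: score every word of t^y against x_i (dot product here;
        # the paper uses a learned energy function), then average.
        alpha = softmax(Y @ X[i])
        context = alpha @ Y        # attended nonlocal context vector
        # Convolution step: local window plus the attended context.
        h = np.tanh(Xp[i] @ W_left + Xp[i + 1] @ W_self + Xp[i + 2] @ W_right
                    + context @ W_attn + b)
        H.append(h)
    return np.array(H)

H = attentive_conv(X, Y)
print(H.shape)                 # one d-dimensional feature per word of t^x
```

Setting `W_attn` to zero recovers an ordinary convolution, which is the sense in which attentive convolution strictly extends the context scope of the operation rather than replacing pooling.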

