Multichannel CNN with Attention for Text Classification

06/29/2020
by Zhenyu Liu, et al.

In recent years, approaches based on neural networks have shown remarkable potential for sentence modeling. There are two main neural network structures: the recurrent neural network (RNN) and the convolutional neural network (CNN). An RNN can capture long-term dependencies and store the semantics of the preceding information in a fixed-size vector; however, it is a biased model, and its ability to extract global semantics is restricted by that fixed-size vector. Alternatively, a CNN can capture n-gram features of texts with convolutional filters, but the width of those filters restricts its performance. To combine the strengths of the two kinds of networks and alleviate their shortcomings, this paper proposes the Attention-based Multichannel Convolutional Neural Network (AMCNN) for text classification. AMCNN uses a bi-directional long short-term memory network to encode the history and future information of words into high-dimensional representations, so that the information at both the front and the back of the sentence can be fully expressed. Scalar attention and vectorial attention are then applied to obtain multichannel representations: scalar attention computes word-level importance, while vectorial attention computes feature-level importance. In the classification task, AMCNN uses a CNN structure to capture word relations on the representations generated by the scalar and vectorial attention mechanisms, instead of calculating weighted sums; this effectively extracts the n-gram features of the text. Experimental results on benchmark datasets demonstrate that AMCNN achieves better performance than state-of-the-art methods. In addition, visualization results verify the semantic richness of the multichannel representations.
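To make the pipeline described above concrete, here is a minimal PyTorch-style sketch of the AMCNN idea: a BiLSTM encoder, a scalar (word-level) and a vectorial (feature-level) attention branch whose outputs form two channels, and a CNN with max-pooling over those channels. All layer names, dimensions, and activation choices are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AMCNNSketch(nn.Module):
    """Illustrative sketch of the AMCNN pipeline from the abstract:
    BiLSTM encoder -> scalar + vectorial attention channels ->
    multichannel CNN + max-pooling -> classifier."""

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128,
                 num_classes=2, kernel_sizes=(3, 4, 5), num_filters=100):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # BiLSTM encodes history and future context of each word.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        enc_dim = 2 * hidden_dim
        # Scalar attention: one importance weight per word (word-level).
        self.scalar_attn = nn.Linear(enc_dim, 1)
        # Vectorial attention: one weight per feature dimension (feature-level).
        self.vector_attn = nn.Linear(enc_dim, enc_dim)
        # CNN over the attention channels instead of a weighted sum.
        self.convs = nn.ModuleList(
            nn.Conv2d(2, num_filters, (k, enc_dim)) for k in kernel_sizes)
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):                        # (B, T)
        h, _ = self.bilstm(self.embedding(token_ids))    # (B, T, 2H)
        word_w = torch.softmax(self.scalar_attn(h), dim=1)  # (B, T, 1)
        feat_w = torch.sigmoid(self.vector_attn(h))          # (B, T, 2H)
        # Two channels: word-weighted and feature-weighted encodings.
        channels = torch.stack([h * word_w, h * feat_w], dim=1)  # (B, 2, T, 2H)
        feats = []
        for conv in self.convs:
            c = F.relu(conv(channels)).squeeze(3)          # (B, F, T-k+1)
            feats.append(F.max_pool1d(c, c.size(2)).squeeze(2))  # (B, F)
        return self.fc(torch.cat(feats, dim=1))            # (B, num_classes)


# Example usage (sequence length must be at least the largest kernel size):
model = AMCNNSketch(vocab_size=10000)
logits = model(torch.randint(0, 10000, (8, 40)))
```

The key design point this sketch tries to reflect is that the attention outputs are not collapsed into a single weighted-sum vector; they are kept as full sequences and stacked as channels, so the convolutional filters can still extract n-gram relations from the attended representations.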
