Text Classification Improved by Integrating Bidirectional LSTM with Two-dimensional Max Pooling

11/21/2016
by Peng Zhou, et al.

Recurrent Neural Networks (RNNs) are among the most popular architectures for Natural Language Processing (NLP) tasks because their recurrent structure is well suited to processing variable-length text. An RNN can exploit distributed representations of words by first converting the tokens of each text into vectors, which together form a matrix with two dimensions: the time-step dimension and the feature-vector dimension. Most existing models then apply a one-dimensional (1D) max pooling or attention-based operation over the time-step dimension alone to obtain a fixed-length vector. However, the features along the feature-vector dimension are not mutually independent, and applying 1D pooling over the time-step dimension alone may destroy the structure of the feature representation. Applying a two-dimensional (2D) pooling operation over both dimensions, by contrast, may sample more meaningful features for sequence-modeling tasks. To integrate the features along both dimensions of the matrix, this paper explores 2D max pooling to obtain a fixed-length representation of the text, and also employs 2D convolution to sample more meaningful information from the matrix. Experiments are conducted on six text classification tasks, including sentiment analysis, question classification, subjectivity classification, and newsgroup classification. Compared with state-of-the-art models, the proposed models achieve excellent performance on four of the six tasks; in particular, one of the proposed models attains the highest accuracy on both the binary and fine-grained Stanford Sentiment Treebank classification tasks.
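To make the core idea concrete, the following is a minimal NumPy sketch (not the authors' implementation) of 2D max pooling applied over both the time-step and feature-vector dimensions of a matrix such as a BiLSTM's output; the function name, pool sizes, and truncation of non-divisible remainders are illustrative assumptions.

```python
import numpy as np

def max_pool_2d(matrix, pool_h, pool_w):
    """2D max pooling over both the time-step (rows) and
    feature-vector (columns) dimensions of `matrix`.

    Rows/columns that do not divide evenly by the pool size
    are truncated (one common, simple choice)."""
    T, D = matrix.shape
    T2, D2 = T // pool_h, D // pool_w
    trimmed = matrix[:T2 * pool_h, :D2 * pool_w]
    # Group the matrix into (pool_h x pool_w) windows, then take
    # the max inside each window.
    windows = trimmed.reshape(T2, pool_h, D2, pool_w)
    return windows.max(axis=(1, 3))

# Toy example: 4 time steps x 6 features, standing in for a BiLSTM
# output matrix (real feature dimensions would be much larger).
mat = np.arange(24, dtype=float).reshape(4, 6)
pooled = max_pool_2d(mat, 2, 2)
print(pooled.shape)  # (2, 3)
```

Unlike 1D pooling over the time-step dimension, which keeps every feature column intact and independent, this operation also merges neighboring features, which is the property the paper argues is useful.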


Related research:

08/30/2019  Sequential Learning of Convolutional Features for Effective Text Classification
11/27/2015  A C-LSTM Neural Network for Text Classification
05/01/2020  Why and when should you pool? Analyzing Pooling in Recurrent Architectures
06/19/2016  Full-Time Supervision based Bidirectional RNN for Factoid Question Answering
11/11/2019  Text classification with pixel embedding
06/30/2020  A Data-driven Neural Network Architecture for Sentiment Analysis
06/24/2022  A multi-model-based deep learning framework for short text multiclass classification with the imbalanced and extremely small data set
