Learning to Skim Text

04/23/2017
by Adams Wei Yu, et al.

Recurrent Neural Networks are showing much promise in many sub-areas of natural language processing, ranging from document classification to machine translation to automatic question answering. Despite their promise, many recurrent models have to read the whole text word by word, which makes them slow on long documents. For example, it is difficult to use a recurrent network to read a book and answer questions about it. In this paper, we present an approach to reading text that skips irrelevant information when needed. The underlying model is a recurrent network that learns how far to jump after reading a few words of the input text. We employ a standard policy gradient method to train the model to make discrete jumping decisions. In our benchmarks on four different tasks, including number prediction, sentiment analysis, news article classification and automatic Q&A, our proposed model, a modified LSTM with jumping, is up to 6 times faster than the standard sequential LSTM, while maintaining the same or even better accuracy.
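The reading loop described above can be sketched without the neural network itself: after consuming a few words, the model samples a discrete jump size and skips ahead. The sketch below is a minimal illustration of that control flow, not the paper's implementation; the parameter names (`read_k`, `max_jump`), the uniform stand-in policy, and the exact jump range are assumptions. In the actual model, the jump distribution is a softmax over the LSTM hidden state, trained with REINFORCE.

```python
import numpy as np

def skim_read(tokens, read_k=2, max_jump=3, rng=None):
    """Read `tokens` left to right, but after every `read_k` words,
    sample a jump size in {0, ..., max_jump} and skip that many words.
    Returns the indices of the tokens actually read."""
    rng = rng or np.random.default_rng(0)
    visited = []
    i = 0
    while i < len(tokens):
        # consume the next `read_k` words sequentially
        end = min(i + read_k, len(tokens))
        visited.extend(range(i, end))
        i = end
        if i >= len(tokens):
            break
        # jump decision: the paper samples from a softmax over the
        # current hidden state; here a uniform stub stands in for it
        probs = np.full(max_jump + 1, 1.0 / (max_jump + 1))
        jump = rng.choice(max_jump + 1, p=probs)
        i += jump  # jump == 0 means keep reading sequentially

    return visited

tokens = list(range(20))
read = skim_read(tokens)
print(f"read {len(read)} of {len(tokens)} tokens")
```

Because some tokens are skipped entirely, the number of recurrent steps (and hence runtime) drops roughly in proportion to the average jump size, which is where the reported speedup comes from.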


