A Way out of the Odyssey: Analyzing and Combining Recent Insights for LSTMs

11/16/2016
by Shayne Longpre, et al.

LSTMs have become a basic building block for many deep NLP models. In recent years, many improvements and variations have been proposed for deep sequence models in general, and LSTMs in particular. We propose and analyze a series of augmentations and modifications to LSTM networks resulting in improved performance on text classification datasets. We observe compounding improvements on traditional LSTMs using Monte Carlo test-time model averaging, average pooling, and residual connections, along with four other suggested modifications. Our analysis provides a simple, reliable, and high-quality baseline model.
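
To make these modifications concrete, below is a minimal PyTorch sketch of an LSTM text classifier combining the three techniques highlighted in the abstract: residual connections between stacked LSTM layers, average pooling over the hidden states, and Monte Carlo test-time model averaging via dropout. This is not the authors' implementation; the two-layer depth, all dimensions, the 0.5 dropout rate, and the names ResidualPooledLSTM and mc_average_predict are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ResidualPooledLSTM(nn.Module):
    """Illustrative LSTM classifier with residual connections between
    layers and average pooling over time (all sizes are assumptions)."""

    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # hidden_dim == embed_dim keeps the residual additions shape-compatible
        self.lstm1 = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.lstm2 = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.dropout = nn.Dropout(0.5)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens):              # tokens: (batch, time) int64 ids
        x = self.embed(tokens)              # (batch, time, embed_dim)
        h1, _ = self.lstm1(x)
        h1 = h1 + x                         # residual connection around layer 1
        h2, _ = self.lstm2(self.dropout(h1))
        h2 = h2 + h1                        # residual connection around layer 2
        pooled = h2.mean(dim=1)             # average pooling over time steps
        return self.classifier(self.dropout(pooled))

def mc_average_predict(model, tokens, num_samples=10):
    """Monte Carlo test-time model averaging: keep dropout active at
    inference and average the class probabilities of several stochastic
    forward passes through the same trained model."""
    model.train()  # leaves dropout on; safe here since the model has no batchnorm
    with torch.no_grad():
        probs = torch.stack([torch.softmax(model(tokens), dim=-1)
                             for _ in range(num_samples)])
    return probs.mean(dim=0)

# Example with dummy data: 4 sequences of 20 token ids, 2 output classes.
model = ResidualPooledLSTM()
print(mc_average_predict(model, torch.randint(0, 10000, (4, 20))).shape)
```

Averaging more stochastic passes reduces the variance of the prediction at a proportional increase in inference cost; the residual additions require matching embedding and hidden sizes, which is why both are set to 128 in this sketch.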

Related research

Bayesian LSTMs in medicine (06/05/2017)
The medical field stands to see significant benefits from the recent adv...

Very Deep Convolutional Networks for Text Classification (06/06/2016)
The dominant approach for many NLP tasks are recurrent neural networks, ...

A Comparison of LSTMs and Attention Mechanisms for Forecasting Financial Time Series (12/18/2018)
While LSTMs show increasingly promising results for forecasting Financia...

EA-LSTM: Evolutionary Attention-based LSTM for Time Series Prediction (11/09/2018)
Time series prediction with deep learning methods, especially long short...

Revisiting Recurrent Networks for Paraphrastic Sentence Embeddings (04/30/2017)
We consider the problem of learning general-purpose, paraphrastic senten...

Simple Yet Surprisingly Effective Training Strategies for LSTMs in Sensor-Based Human Activity Recognition (12/23/2022)
Human Activity Recognition (HAR) is one of the core research areas in mo...

Fast and Accurate Entity Recognition with Iterated Dilated Convolutions (02/07/2017)
Today when many practitioners run basic NLP on the entire web and large-...
