Variational Bi-LSTMs

11/15/2017
by Samira Shabanian, et al.

Recurrent neural networks such as the long short-term memory (LSTM) are important architectures for sequential prediction tasks. LSTMs (and RNNs in general) model sequences along the forward time direction. Bidirectional LSTMs (Bi-LSTMs), on the other hand, model sequences along both the forward and backward directions and are generally known to perform better at such tasks because they capture a richer representation of the data. In standard Bi-LSTM training, however, the forward and backward paths are learned independently. We propose a variant of the Bi-LSTM architecture, which we call the Variational Bi-LSTM, that creates a channel between the two paths during training (the channel may be omitted during inference), thereby optimizing the two paths jointly. We arrive at this joint objective by minimizing a variational lower bound on the joint likelihood of the data sequence. Our model acts as a regularizer and encourages the two networks to inform each other in making their respective predictions using distinct information. We perform ablation studies to better understand the different components of our model and evaluate the method on various benchmarks, showing state-of-the-art performance.
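To make the idea concrete, the following is a minimal NumPy sketch of the "channel" described above: at each time step, a latent variable is sampled from a Gaussian parameterized by the forward hidden state (via the reparameterization trick) and fed into the backward LSTM. This is an illustrative assumption, not the authors' implementation; the exact parameterization of the channel, the dimensions, and the weight initialization here are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W):
    """One LSTM step; W maps [x; h] to the four stacked gate pre-activations."""
    a = W @ np.concatenate([x, h])
    i, f, o, g = np.split(a, 4)
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    c = sig(f) * c + sig(i) * np.tanh(g)
    return sig(o) * np.tanh(c), c

def variational_bilstm(xs, d=8, dz=4):
    """Run a forward and a backward LSTM over xs. At each step a latent z is
    sampled from a diagonal Gaussian conditioned on the forward hidden state
    and passed to the backward pass -- the channel between the two paths."""
    dx = xs[0].shape[0]
    Wf = rng.normal(scale=0.1, size=(4 * d, dx + d))        # forward LSTM
    Wb = rng.normal(scale=0.1, size=(4 * d, dx + dz + d))   # backward LSTM sees z
    Wmu = rng.normal(scale=0.1, size=(dz, d))               # q(z | h_fwd): mean
    Wlv = rng.normal(scale=0.1, size=(dz, d))               # q(z | h_fwd): log-variance

    # Forward pass: collect hidden states and sampled latents.
    h, c = np.zeros(d), np.zeros(d)
    hs_f, zs = [], []
    for x in xs:
        h, c = lstm_step(x, h, c, Wf)
        mu, logvar = Wmu @ h, Wlv @ h
        z = mu + np.exp(0.5 * logvar) * rng.standard_normal(dz)  # reparam. sample
        hs_f.append(h)
        zs.append(z)

    # Backward pass: each step is conditioned on the forward path's latent.
    h, c = np.zeros(d), np.zeros(d)
    hs_b = []
    for x, z in zip(reversed(xs), reversed(zs)):
        h, c = lstm_step(np.concatenate([x, z]), h, c, Wb)
        hs_b.append(h)
    hs_b.reverse()
    return hs_f, hs_b

T, dx = 5, 3
xs = [rng.standard_normal(dx) for _ in range(T)]
hf, hb = variational_bilstm(xs)
print(len(hf), len(hb), hf[0].shape)  # 5 5 (8,)
```

During training, the variational objective would add a KL term between the Gaussian above and a prior, alongside the two paths' prediction losses; at inference the channel can be dropped, as the abstract notes.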


