Iterative evaluation of LSTM cells

07/11/2018
by Leandro Palma, et al.

In this work we present a modification of the conventional flow of information through an LSTM network, which we consider well suited for RNNs in general. The modification leads to an iterative scheme in which the computations performed by the LSTM cell are repeated over constant input and cell state values, while the hidden state is updated a finite number of times. We provide theoretical and empirical evidence for the augmented capabilities of the iterative scheme and show examples related to language modeling. The modification yields performance comparable to that of the original model enlarged to more than three times its total number of parameters.
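The abstract only outlines the iterative scheme, so the following is a minimal sketch of one plausible reading, built on torch.nn.LSTMCell: the cell is re-evaluated several times on the same input and cell state, and only the hidden state is refreshed between inner iterations. The class name IterativeLSTMCell, the parameter n_iters, and the choice to commit the new cell state only on the final iteration are illustrative assumptions, not the authors' implementation.

```python
# Sketch of an iterative LSTM cell evaluation (assumed reading of the paper).
import torch
import torch.nn as nn


class IterativeLSTMCell(nn.Module):
    """Re-evaluates an LSTM cell n_iters times on the same input and
    cell state, updating only the hidden state between iterations."""

    def __init__(self, input_size: int, hidden_size: int, n_iters: int = 3):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.n_iters = n_iters

    def forward(self, x, state):
        h, c = state
        # Inner iterations: input x and cell state c are held constant,
        # only the hidden state h is refreshed.
        for _ in range(self.n_iters - 1):
            h, _ = self.cell(x, (h, c))
        # Final evaluation also commits the new cell state (assumption).
        h, c = self.cell(x, (h, c))
        return h, c


# Usage: unroll over a sequence exactly like a standard LSTMCell.
if __name__ == "__main__":
    cell = IterativeLSTMCell(input_size=16, hidden_size=32, n_iters=3)
    x_seq = torch.randn(10, 4, 16)   # (seq_len, batch, input_size)
    h = torch.zeros(4, 32)
    c = torch.zeros(4, 32)
    for x_t in x_seq:
        h, c = cell(x_t, (h, c))
    print(h.shape, c.shape)          # torch.Size([4, 32]) for both
```

Note that the inner loop adds computation but no extra parameters, which is consistent with the abstract's claim of matching a model several times larger in parameter count.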

