RotLSTM: Rotating Memories in Recurrent Neural Networks

05/01/2021
by Vlad Velici, et al.

Long Short-Term Memory (LSTM) units can memorise and exploit long-term dependencies between inputs when generating predictions on time-series data. We introduce the concept of modifying the cell state (memory) of an LSTM with rotation matrices parametrised by a new set of trainable weights. This addition yields significant performance improvements on some tasks from the bAbI dataset.
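
To make the idea concrete, below is a minimal PyTorch sketch of an LSTM cell whose memory is rotated at each step by a parametrised rotation. Everything beyond the abstract is an assumption made for illustration: the rotation is realised as a block-diagonal matrix of 2x2 planar rotations, the angles come from an extra trainable linear layer applied to the current input and previous hidden state, and names such as RotLSTMCell are invented here rather than taken from the paper.

# Minimal sketch of an LSTM cell whose cell state (memory) is rotated each
# step by a parametrised rotation matrix. Assumptions (not from the abstract):
# the rotation is block-diagonal, built from 2x2 planar rotations whose angles
# are computed from [x_t, h_{t-1}] via an extra trainable weight matrix.
import torch
import torch.nn as nn


class RotLSTMCell(nn.Module):  # illustrative name, not from the paper
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        assert hidden_size % 2 == 0, "pairwise rotations need an even state size"
        self.hidden_size = hidden_size
        # Standard LSTM gates (input, forget, cell candidate, output).
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        # Extra trainable weights producing one angle per 2D plane of the cell state.
        self.angles = nn.Linear(input_size + hidden_size, hidden_size // 2)

    def forward(self, x, state):
        h, c = state
        z = torch.cat([x, h], dim=-1)
        i, f, g, o = self.gates(z).chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)

        # Rotate the previous cell state: each consecutive pair of dimensions
        # (c[2k], c[2k+1]) is rotated by a learned, input-dependent angle.
        theta = torch.tanh(self.angles(z)) * torch.pi            # angles in (-pi, pi)
        cos_t, sin_t = torch.cos(theta), torch.sin(theta)
        c_pairs = c.view(*c.shape[:-1], -1, 2)                   # (..., hidden/2, 2)
        c_rot = torch.stack(
            [cos_t * c_pairs[..., 0] - sin_t * c_pairs[..., 1],
             sin_t * c_pairs[..., 0] + cos_t * c_pairs[..., 1]],
            dim=-1,
        ).flatten(-2)

        # Usual LSTM update, applied to the rotated memory.
        c_new = f * c_rot + i * g
        h_new = o * torch.tanh(c_new)
        return h_new, (h_new, c_new)

A cell like this can be unrolled over a sequence exactly as a standard LSTMCell would be; where the rotation sits in the update (before or after gating) and how the angles are constrained are details the abstract does not fix.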

Related research

Time Series Forecasting Based on Augmented Long Short-Term Memory (07/03/2017)
In this paper, we use recurrent autoencoder model to predict the time se...

Learning Long-Term Dependencies in Irregularly-Sampled Time Series (06/08/2020)
Recurrent neural networks (RNNs) with continuous-time hidden states are ...

ARMA Cell: A Modular and Effective Approach for Neural Autoregressive Modeling (08/31/2022)
The autoregressive moving average (ARMA) model is a classical, and argua...

Cells in Multidimensional Recurrent Neural Networks (12/08/2014)
The transcription of handwritten text on images is one task in machine l...

Using Autoencoders To Learn Interesting Features For Detecting Surveillance Aircraft (09/27/2018)
This paper explores using a Long short-term memory (LSTM) based sequence...

h-detach: Modifying the LSTM Gradient Towards Better Optimization (10/06/2018)
Recurrent neural networks are known for their notorious exploding and va...

Generative Image Modeling Using Spatial LSTMs (06/10/2015)
Modeling the distribution of natural images is challenging, partly becau...
