Learning Long-Term Dependencies in Irregularly-Sampled Time Series

06/08/2020
by Mathias Lechner, et al.

Recurrent neural networks (RNNs) with continuous-time hidden states are a natural fit for modeling irregularly-sampled time series. These models, however, face difficulties when the input data possess long-term dependencies. We prove that, similar to standard RNNs, the underlying reason for this issue is the vanishing or exploding of the gradient during training. This phenomenon is expressed by the ordinary differential equation (ODE) representation of the hidden state, regardless of the choice of ODE solver. We provide a solution by designing a new algorithm based on the long short-term memory (LSTM) that separates its memory from its time-continuous state. This way, we encode a continuous-time dynamical flow within the RNN, allowing it to respond to inputs arriving at arbitrary time lags while ensuring constant error propagation through the memory path. We call these RNN models ODE-LSTMs. We experimentally show that ODE-LSTMs outperform advanced RNN-based counterparts on non-uniformly sampled data with long-term dependencies.
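The key construction is the separation of the LSTM's memory from its time-continuous state: the memory cell keeps the standard gated LSTM update, which provides a gradient path with constant error propagation, while the hidden state is evolved by a learned ODE over the elapsed time between observations. Below is a minimal NumPy sketch of that idea, not the authors' reference implementation; the class name, weight shapes, initialization, and the fixed-step explicit Euler solver are all illustrative assumptions (per the abstract, the gradient result holds regardless of the solver).

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    class ODELSTMCell:
        # Illustrative ODE-LSTM cell: the memory c follows the usual gated
        # LSTM update (constant error propagation), while the hidden state h
        # is additionally evolved by a learned ODE for the elapsed time
        # delta_t between observations.
        def __init__(self, input_size, hidden_size, euler_steps=4, seed=0):
            rng = np.random.default_rng(seed)
            k = input_size + hidden_size
            # One weight matrix and bias per LSTM gate:
            # input (i), forget (f), output (o), candidate (c).
            self.W = {g: rng.normal(0.0, 0.1, (hidden_size, k)) for g in "ifoc"}
            self.b = {g: np.zeros(hidden_size) for g in "ifoc"}
            # Parameters of the learned vector field dh/dt = f(h).
            self.W_ode = rng.normal(0.0, 0.1, (hidden_size, hidden_size))
            self.b_ode = np.zeros(hidden_size)
            self.euler_steps = euler_steps  # fixed-step Euler: an assumption

        def _f(self, h):
            # Learned continuous-time dynamics of the hidden state.
            return np.tanh(self.W_ode @ h + self.b_ode)

        def step(self, x, h, c, delta_t):
            z = np.concatenate([x, h])
            i = sigmoid(self.W["i"] @ z + self.b["i"])  # input gate
            f = sigmoid(self.W["f"] @ z + self.b["f"])  # forget gate
            o = sigmoid(self.W["o"] @ z + self.b["o"])  # output gate
            g = np.tanh(self.W["c"] @ z + self.b["c"])  # candidate memory
            c = f * c + i * g        # memory path: elementwise, gradient-stable
            h = o * np.tanh(c)       # proposed hidden state
            # Evolve h through the ODE for the irregular time gap delta_t.
            dt = delta_t / self.euler_steps
            for _ in range(self.euler_steps):
                h = h + dt * self._f(h)
            return h, c

    # Usage on an irregularly-sampled sequence (hypothetical inputs):
    cell = ODELSTMCell(input_size=3, hidden_size=8)
    h, c = np.zeros(8), np.zeros(8)
    for x, dt in [(np.ones(3), 0.1), (np.ones(3), 2.7)]:  # arbitrary gaps
        h, c = cell.step(x, h, c, dt)

Because error signals reach earlier steps through the elementwise gated update of c rather than through repeated applications of the ODE flow, the memory path avoids the vanishing/exploding behavior that the abstract attributes to the ODE representation of the hidden state.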

