Learning Over Long Time Lags

02/13/2016
by Hojjat Salehinejad et al.

The ability of recurrent neural networks (RNNs) to learn dependencies in time-series data distinguishes them from other deep learning models. Many advances have recently been proposed in this emerging field, but the literature lacks a comprehensive review of memory models in RNNs. This paper provides a fundamental review of RNNs and the long short-term memory (LSTM) model, then surveys recent advances in memory enhancements and learning techniques for capturing long-term dependencies in RNNs.
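For readers unfamiliar with the mechanism the paper reviews, below is a minimal sketch of a single LSTM forward step in NumPy. It follows the standard gate formulation (forget, input, and output gates around an additively updated cell state); all names here (lstm_step, W_f, W_i, W_c, W_o) are illustrative and not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, params):
    """One LSTM time step: gates decide what to forget, write, and expose."""
    W_f, W_i, W_c, W_o, b_f, b_i, b_c, b_o = params
    z = np.concatenate([h_prev, x])    # stacked input [h_{t-1}; x_t]
    f = sigmoid(W_f @ z + b_f)         # forget gate
    i = sigmoid(W_i @ z + b_i)         # input gate
    c_tilde = np.tanh(W_c @ z + b_c)   # candidate cell state
    c = f * c_prev + i * c_tilde       # additive cell update; this path
                                       # lets gradients flow over long lags
    o = sigmoid(W_o @ z + b_o)         # output gate
    h = o * np.tanh(c)                 # new hidden state
    return h, c

# Example usage with random weights (hidden size 4, input size 3).
rng = np.random.default_rng(0)
n_h, n_x = 4, 3
params = [rng.standard_normal((n_h, n_h + n_x)) * 0.1 for _ in range(4)] + \
         [np.zeros(n_h) for _ in range(4)]
h, c = np.zeros(n_h), np.zeros(n_h)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(n_x), h, c, params)
```

The additive cell update `c = f * c_prev + i * c_tilde` is the key design choice: unlike a plain RNN's repeatedly squashed state, it gives error signals a near-linear path backward through time, which is why LSTMs can learn over long time lags.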


