Achieving Online Regression Performance of LSTMs with Simple RNNs

05/16/2020
by N. Mert Vural, et al.

Recurrent Neural Networks (RNNs) are widely used for online regression due to their ability to generalize nonlinear temporal dependencies. Among RNN models, Long Short-Term Memory networks (LSTMs) are commonly preferred in practice, as they are capable of learning long-term dependencies while avoiding the vanishing gradient problem. However, due to their large number of parameters, LSTMs require considerably longer training time than simple RNNs (SRNNs). In this paper, we efficiently achieve the online regression performance of LSTMs with SRNNs. To this end, we introduce a first-order training algorithm with linear time complexity in the number of parameters. We show that SRNNs trained with our algorithm provide regression performance very similar to that of LSTMs in two to three times shorter training time. We support our experimental results with strong theoretical analysis by providing regret bounds on the convergence rate of our algorithm. Through an extensive set of experiments, we verify our theoretical work and demonstrate significant performance improvements of our algorithm over LSTMs and state-of-the-art training methods.
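The abstract does not specify the algorithm's update rule, but the setting it describes — online regression with an SRNN trained by a first-order method at O(number of parameters) cost per step — can be sketched as follows. This is a generic illustration using plain SGD with a one-step truncated gradient, not the paper's actual algorithm; all variable names and the toy data are assumptions.

```python
import numpy as np

# Hedged sketch: online regression with a simple RNN (SRNN) and a
# first-order update. The recurrence gradient is truncated at one step,
# so each update costs O(number of parameters), matching the linear
# per-step complexity the abstract claims. Illustrative only.

rng = np.random.default_rng(0)
n_in, n_h = 3, 8                       # input and hidden dimensions
U = rng.normal(0, 0.3, (n_h, n_in))    # input-to-hidden weights
W = rng.normal(0, 0.3, (n_h, n_h))     # hidden-to-hidden weights
w = np.zeros(n_h)                      # hidden-to-output weights
lr = 0.05                              # learning rate

def step(h_prev, x, y, U, W, w):
    """One online step: predict y_hat, then return truncated gradients."""
    h = np.tanh(U @ x + W @ h_prev)    # SRNN state update
    y_hat = w @ h                      # linear readout
    err = y_hat - y                    # gradient factor of squared loss
    g_w = err * h
    g_a = err * w * (1.0 - h**2)       # backprop through tanh only
    g_U = np.outer(g_a, x)
    g_W = np.outer(g_a, h_prev)
    return h, y_hat, (g_U, g_W, g_w)

# Online loop on a toy stream: target is a noisy linear function of x.
h = np.zeros(n_h)
losses = []
for t in range(2000):
    x = rng.normal(size=n_in)
    y = 0.5 * x.sum() + 0.1 * rng.normal()
    h, y_hat, (g_U, g_W, g_w) = step(h, x, y, U, W, w)
    U -= lr * g_U
    W -= lr * g_W
    w -= lr * g_w
    losses.append((y_hat - y) ** 2)

print(np.mean(losses[:200]), np.mean(losses[-200:]))
```

In an online (regret) setting like the one analyzed in the paper, each sample is predicted before its label is used for exactly one update, so the squared error accumulated early in the stream should exceed the error accumulated once the weights have adapted.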
