A comparison of LSTM and GRU networks for learning symbolic sequences

07/05/2021
by Roberto Cahuantzi et al.

We explore the relation between the hyper-parameters of a recurrent neural network (RNN) and the complexity of the string sequences it is able to memorize. We compare long short-term memory (LSTM) networks and gated recurrent units (GRUs). We find that increasing RNN depth does not necessarily improve memorization capability when the training time is constrained. Our results also indicate that the learning rate and the number of units per layer are among the most important hyper-parameters to tune. Generally, GRUs outperform LSTM networks on low-complexity sequences, while on high-complexity sequences LSTMs perform better.
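The paper's own experimental code is not reproduced here. As a rough illustration of the kind of comparison the abstract describes, the sketch below (assuming a PyTorch implementation; the vocabulary size, sequence length, unit count, learning rate, and training budget are illustrative values, not the paper's) trains an LSTM and a GRU of matched size to memorize one fixed symbolic sequence via next-token prediction, with the two hyper-parameters the abstract flags as most important exposed as constants to sweep.

```python
# Minimal sketch: LSTM vs. GRU on memorizing a fixed symbolic sequence.
# All hyper-parameter values below are assumptions for illustration.
import torch
import torch.nn as nn

VOCAB, SEQ_LEN, UNITS, LR = 4, 20, 64, 1e-3  # assumed values to sweep


class SeqMemorizer(nn.Module):
    def __init__(self, cell: str, units: int):
        super().__init__()
        rnn_cls = nn.LSTM if cell == "lstm" else nn.GRU
        self.embed = nn.Embedding(VOCAB, units)
        self.rnn = rnn_cls(units, units, batch_first=True)
        self.head = nn.Linear(units, VOCAB)

    def forward(self, x):
        # Both nn.LSTM and nn.GRU return (outputs, hidden state(s)).
        h, _ = self.rnn(self.embed(x))
        return self.head(h)


# One fixed random symbolic sequence the network must memorize:
# inputs are tokens 0..T-1, targets are tokens 1..T (next-token prediction).
seq = torch.randint(0, VOCAB, (1, SEQ_LEN + 1))
x, y = seq[:, :-1], seq[:, 1:]

for cell in ("lstm", "gru"):
    model = SeqMemorizer(cell, UNITS)
    opt = torch.optim.Adam(model.parameters(), lr=LR)
    for step in range(500):  # constrained training budget
        opt.zero_grad()
        loss = nn.functional.cross_entropy(
            model(x).reshape(-1, VOCAB), y.reshape(-1))
        loss.backward()
        opt.step()
    acc = (model(x).argmax(-1) == y).float().mean().item()
    print(f"{cell}: final loss {loss.item():.4f}, recall accuracy {acc:.2f}")
```

Raising SEQ_LEN (or drawing the sequence from a more complex generator) while holding the step budget fixed gives one way to probe the trade-off the abstract reports between sequence complexity and cell type.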

Related research

04/22/2023  Recurrent Neural Networks and Long Short-Term Memory Networks: Tutorial and Survey
This is a tutorial paper on Recurrent Neural Network (RNN), Long Short-T...

01/18/2019  Slim LSTM networks: LSTM_6 and LSTM_C6
We have shown previously that our parameter-reduced variants of Long Sho...

08/16/2015  Depth-Gated LSTM
In this short note, we present an extension of long short-term memory (L...

09/11/2018  DeepProteomics: Protein family classification using Shallow and Deep Networks
The knowledge regarding the function of proteins is necessary as it give...

09/23/2021  LSTM Hyper-Parameter Selection for Malware Detection: Interaction Effects and Hierarchical Selection Approach
Long-Short-Term-Memory (LSTM) networks have shown great promise in artif...

07/26/2018  Towards a Deep Unified Framework for Nuclear Reactor Perturbation Analysis
This paper proposes the first step towards a novel unified framework for...

12/26/2020  Assessment of the Relative Importance of different hyper-parameters of LSTM for an IDS
Recurrent deep learning language models like the LSTM are often used to ...
