A recurrent neural network without chaos

12/19/2016
by Thomas Laurent, et al.

We introduce an exceptionally simple gated recurrent neural network (RNN) that achieves performance comparable to well-known gated architectures, such as LSTMs and GRUs, on the word-level language modeling task. We prove that our model has simple, predictable, and non-chaotic dynamics. This stands in stark contrast to more standard gated architectures, whose underlying dynamical systems exhibit chaotic behavior.
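As a concrete illustration, below is a minimal NumPy sketch of a chaos-free gated update in the spirit of the abstract. The gate names (theta, eta), dimensions, and initialization are illustrative assumptions, not a verbatim reproduction of the authors' model: each gate is a sigmoid of the input and previous state, and the only recurrence on the hidden state is an elementwise product with a squashed copy of itself.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class ChaosFreeCell:
        """Sketch of a simple gated RNN cell with contracting dynamics.

        Assumed update (our reading, not necessarily the paper's exact equations):
            theta = sigmoid(U_t h + V_t x + b_t)   # forget-style gate
            eta   = sigmoid(U_e h + V_e x + b_e)   # input-style gate
            h_new = theta * tanh(h) + eta * tanh(W x)
        No matrix product is applied to the state inside the recurrence,
        only the elementwise theta * tanh(h), which contracts toward zero.
        """

        def __init__(self, input_dim, hidden_dim, seed=0):
            rng = np.random.default_rng(seed)
            s = 1.0 / np.sqrt(hidden_dim)
            self.U_t = rng.uniform(-s, s, (hidden_dim, hidden_dim))
            self.V_t = rng.uniform(-s, s, (hidden_dim, input_dim))
            self.b_t = np.zeros(hidden_dim)
            self.U_e = rng.uniform(-s, s, (hidden_dim, hidden_dim))
            self.V_e = rng.uniform(-s, s, (hidden_dim, input_dim))
            self.b_e = np.zeros(hidden_dim)
            self.W = rng.uniform(-s, s, (hidden_dim, input_dim))

        def step(self, h, x):
            theta = sigmoid(self.U_t @ h + self.V_t @ x + self.b_t)
            eta = sigmoid(self.U_e @ h + self.V_e @ x + self.b_e)
            return theta * np.tanh(h) + eta * np.tanh(self.W @ x)

    # With zero input the eta branch vanishes (tanh(0) = 0), so the state
    # obeys h <- theta * tanh(h) and each coordinate decays toward the origin
    # instead of wandering on a strange attractor.
    cell = ChaosFreeCell(input_dim=4, hidden_dim=8)
    h = np.random.default_rng(1).standard_normal(8)
    for _ in range(50):
        h = cell.step(h, np.zeros(4))
    print(np.linalg.norm(h))  # near 0: the zero-input dynamics contract globally

This predictability in the absence of input is exactly the property the abstract contrasts with LSTMs and GRUs, whose zero-input iterations can exhibit sensitive dependence on the initial state.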

Related research

08/10/2021 · Recurrent neural network-based Internal Model Control of unknown nonlinear stable systems
Owing to their superior modeling capabilities, gated Recurrent Neural Ne...

03/01/2017 · The Statistical Recurrent Unit
Sophisticated gated recurrent neural network architectures like LSTMs an...

11/18/2017 · MinimalRNN: Toward More Interpretable and Trainable Recurrent Neural Networks
We introduce MinimalRNN, a new recurrent neural network architecture tha...

03/05/2019 · Gated Graph Convolutional Recurrent Neural Networks
Graph processes model a number of important problems such as identifying...

09/17/2017 · Hierarchical Gated Recurrent Neural Tensor Network for Answer Triggering
In this paper, we focus on the problem of answer triggering addressed b...

08/02/2022 · Analog Gated Recurrent Neural Network for Detecting Chewing Events
We present a novel gated recurrent neural network to detect when a perso...

05/25/2017 · Predictive State Recurrent Neural Networks
We present a new model, Predictive State Recurrent Neural Networks (PSRN...
