A recurrent neural network without chaos

12/19/2016
by Thomas Laurent, et al.

We introduce an exceptionally simple gated recurrent neural network (RNN) that achieves performance comparable to well-known gated architectures, such as LSTMs and GRUs, on the word-level language modeling task. We prove that our model has simple, predictable, and non-chaotic dynamics. This stands in stark contrast to more standard gated architectures, whose underlying dynamical systems exhibit chaotic behavior.
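The abstract does not reproduce the model's equations, but the kind of architecture it describes can be illustrated with a minimal sketch: a gated cell whose hidden state is a gated, squashed copy of the previous state plus a gated transform of the current input, with no full recurrent weight matrix inside the state nonlinearity. The gate structure and parameter names below are assumptions for illustration, not necessarily the paper's exact model.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SimpleGatedCell:
    """Hypothetical sketch of a simple gated recurrent cell.

    Update (an assumption, not taken verbatim from the paper):
        theta_t = sigmoid(Ut @ h_{t-1} + Vt @ x_t + bt)   # forget-style gate
        eta_t   = sigmoid(Ue @ h_{t-1} + Ve @ x_t + be)   # input-style gate
        h_t     = theta_t * tanh(h_{t-1}) + eta_t * tanh(W @ x_t)
    """

    def __init__(self, n_in, n_hid, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1  # small init scale
        self.W  = rng.normal(0, s, (n_hid, n_in))
        self.Ut = rng.normal(0, s, (n_hid, n_hid))
        self.Vt = rng.normal(0, s, (n_hid, n_in))
        self.bt = np.zeros(n_hid)
        self.Ue = rng.normal(0, s, (n_hid, n_hid))
        self.Ve = rng.normal(0, s, (n_hid, n_in))
        self.be = np.zeros(n_hid)

    def step(self, h, x):
        theta = sigmoid(self.Ut @ h + self.Vt @ x + self.bt)
        eta   = sigmoid(self.Ue @ h + self.Ve @ x + self.be)
        return theta * np.tanh(h) + eta * np.tanh(self.W @ x)
```

With zero input, the update collapses to `h_t = theta_t * tanh(h_{t-1})`; since `|tanh(z)| < |z|` for `z != 0` and the gate lies in `(0, 1)`, every component of the state shrinks in magnitude at each step, so the input-free dynamics contract toward the origin. This is the flavor of "predictable, non-chaotic" behavior the abstract claims, in contrast to LSTMs and GRUs, whose input-free maps can be chaotic.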
