Approximate Fixed-Points in Recurrent Neural Networks

06/04/2021
by   Zhengxiong Wang, et al.

Recurrent neural networks are widely used in speech and language processing. Due to their dependency on the past, standard algorithms for training these models, such as back-propagation through time (BPTT), cannot be efficiently parallelised. Furthermore, applying these models to structures more complex than sequences requires inference-time approximations, which introduce inconsistency between inference and training. This paper shows that recurrent neural networks can be reformulated as fixed-points of non-linear equation systems. These fixed-points can be computed exactly by an iterative algorithm in as many iterations as the length of any given sequence. Each iteration adds one additional Markovian-like order of dependencies, so that upon termination all dependencies modelled by the recurrent neural network have been incorporated. Although exact fixed-points inherit the same parallelisation and inconsistency issues, this paper shows that approximate fixed-points can be computed in parallel and used consistently in training and inference, including tasks such as lattice rescoring. Experimental validation on two tasks, Penn Tree Bank and WikiText-2, shows that approximate fixed-points yield prediction performance competitive with recurrent neural networks trained using the BPTT algorithm.
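
The core idea can be sketched as follows. This is a minimal numpy illustration under assumptions not stated in the abstract (a simple tanh recurrence h_t = tanh(W h_{t-1} + U x_t + b), a zero initial guess, and a Jacobi-style sweep); the function and variable names are hypothetical and do not reflect the authors' implementation. Viewing the T hidden states as a system of T coupled non-linear equations, each sweep updates every state from the previous sweep's states, so a single sweep is parallel across time steps, k sweeps capture dependencies up to k steps back, and T sweeps recover the sequential recurrence exactly.

```python
import numpy as np

def sequential_rnn(x, W, U, b, h0):
    """Standard sequential recurrence: h_t = tanh(W h_{t-1} + U x_t + b)."""
    T = x.shape[0]
    h = np.zeros((T, W.shape[0]))
    prev = h0
    for t in range(T):
        prev = np.tanh(W @ prev + U @ x[t] + b)
        h[t] = prev
    return h

def fixed_point_rnn(x, W, U, b, h0, num_iters):
    """Jacobi-style fixed-point iteration over the whole state sequence.

    Each sweep updates every time step from the previous sweep's states,
    so all positions can be computed in parallel. After T sweeps the result
    matches the sequential recurrence; fewer sweeps give an approximation.
    """
    T = x.shape[0]
    H = np.zeros((T, W.shape[0]))        # initial guess for all states
    drive = x @ U.T + b                  # input contribution, parallel over t
    for _ in range(num_iters):
        prev = np.vstack([h0, H[:-1]])   # previous sweep's states, shifted by one
        H = np.tanh(prev @ W.T + drive)  # update every time step at once
    return H

# Usage: with num_iters equal to the sequence length, the fixed-point
# iteration reproduces the sequential RNN; smaller num_iters truncates
# the Markovian-like order of dependencies.
rng = np.random.default_rng(0)
T, d_in, d = 5, 3, 4
x = rng.normal(size=(T, d_in))
W = 0.1 * rng.normal(size=(d, d))
U = rng.normal(size=(d, d_in))
b = np.zeros(d)
h0 = np.zeros(d)
assert np.allclose(sequential_rnn(x, W, U, b, h0),
                   fixed_point_rnn(x, W, U, b, h0, num_iters=T))
```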
