Layer Flexible Adaptive Computational Time for Recurrent Neural Networks

12/06/2018
by Lida Zhang, et al.

Deep recurrent neural networks yield significant gains on prediction tasks, but choosing the number of layers remains an open problem, especially since tasks of different difficulty demand different amounts of computation. We propose a layer-flexible recurrent neural network with adaptive computational time and extend it to a sequence-to-sequence model with teacher-forcing-based input policies. The model applies an attention mechanism over all the computation rounds in each step to construct a transmission state for each layer individually in the next step. We evaluate the model on the problem of tick price prediction. Experimental results show improved performance and demonstrate the model's ability to dynamically change its number of layers.
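The core mechanism the abstract describes, attending over the hidden states of all computation rounds in a step to build a per-layer transmission state, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation; the function name, the use of a dot-product score, and the toy dimensions are all assumptions for the sketch.

```python
import numpy as np

def attention_transmission_state(round_states, query):
    """Hypothetical sketch: combine the hidden states produced by the
    computation rounds of one time step into a single transmission state
    via softmax attention, in the spirit of the mechanism the abstract
    describes. round_states: (n_rounds, hidden_dim); query: (hidden_dim,)."""
    scores = round_states @ query               # one score per round
    weights = np.exp(scores - scores.max())     # numerically stable softmax
    weights /= weights.sum()
    return weights @ round_states               # weighted sum: (hidden_dim,)

# Toy usage: three computation rounds, hidden size 4.
rng = np.random.default_rng(0)
states = rng.standard_normal((3, 4))
query = rng.standard_normal(4)
h_next = attention_transmission_state(states, query)
print(h_next.shape)
```

In the model, one such attention pass would presumably be run per layer with a layer-specific query, so each layer receives its own transmission state for the next step.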


