An Improved Time Feedforward Connections Recurrent Neural Networks

11/03/2022
by Jin Wang, et al.

Recurrent Neural Networks (RNNs) have been widely applied to temporal problems such as flood forecasting and financial data processing. On one hand, traditional RNN models amplify the gradient problem because of their strict serial time dependency, which makes it difficult to realize a long-term memory function. On the other hand, RNN cells are highly complex, which significantly increases computational cost and wastes computational resources during model training. In this paper, an improved Time Feedforward Connections Recurrent Neural Networks (TFC-RNNs) model is first proposed to address the gradient problem. A parallel branch is introduced so that the hidden state at time t-2 can be transferred directly to time t, bypassing the nonlinear transformation at time t-1; this effectively improves the ability of RNNs to capture long-term dependencies. A novel cell structure named the Single Gate Recurrent Unit (SGRU) is then presented. This cell structure reduces the number of parameters in the RNN cell and consequently its computational complexity. Applying SGRU within the TFC-RNNs framework yields a new TFC-SGRU model that addresses both difficulties. Finally, the long-term memory and anti-interference capabilities of the proposed TFC-SGRU are verified through several experiments. The results demonstrate that the TFC-SGRU model can capture useful information at a time step of 1500 and effectively filter out noise. In terms of language-processing accuracy, TFC-SGRU outperforms comparable LSTM and GRU models.
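
The abstract names two mechanisms but gives no equations: a single-gate cell (SGRU) and a time feedforward connection that carries the hidden state from time t-2 to time t without a nonlinear transformation. The PyTorch sketch below is one plausible reading, not the paper's actual formulation: the gate layout of SGRUCell, the class names, and the additive combination of the skip branch in TFCSGRU are all assumptions.

```python
import torch
import torch.nn as nn


class SGRUCell(nn.Module):
    """Hypothetical single-gate recurrent cell.

    The abstract only states that SGRU uses fewer parameters than a
    standard GRU cell; the equations below are an illustrative guess,
    not the paper's exact formulation.
    """

    def __init__(self, input_size, hidden_size):
        super().__init__()
        # One gate instead of the GRU's update + reset gates:
        # roughly one third fewer parameters than GRU.
        self.gate = nn.Linear(input_size + hidden_size, hidden_size)
        self.cand = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x, h_prev):
        xh = torch.cat([x, h_prev], dim=-1)
        z = torch.sigmoid(self.gate(xh))          # single gate
        h_tilde = torch.tanh(self.cand(xh))       # candidate state
        return (1 - z) * h_prev + z * h_tilde


class TFCSGRU(nn.Module):
    """Sketch of the time feedforward connection: the hidden state from
    t-2 reaches t directly, bypassing the nonlinear step at t-1."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.cell = SGRUCell(input_size, hidden_size)
        self.hidden_size = hidden_size

    def forward(self, xs):  # xs: (seq_len, batch, input_size)
        batch = xs.size(1)
        h_prev = xs.new_zeros(batch, self.hidden_size)   # h_{t-1}
        h_prev2 = xs.new_zeros(batch, self.hidden_size)  # h_{t-2}
        outputs = []
        for x in xs:
            h = self.cell(x, h_prev)
            # Parallel branch: add h_{t-2} without any nonlinear
            # transform. Whether the paper combines the two paths by
            # sum, gate, or concatenation is an assumption here.
            h = h + h_prev2
            h_prev2, h_prev = h_prev, h
            outputs.append(h)
        return torch.stack(outputs)
```

Summing in the skip branch keeps the path from h at t-2 to h at t free of saturating nonlinearities, which is the property the abstract credits for the improved gradient flow; the paper may combine the two paths differently.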

Related research

- 05/28/2020: Learning Various Length Dependence by Dual Recurrent Neural Networks
- 07/05/2023: A Versatile Hub Model For Efficient Information Propagation And Feature Selection
- 12/28/2015: Feedforward Sequential Memory Networks: A New Structure to Learn Long-term Dependency
- 02/26/2016: Architectural Complexity Measures of Recurrent Neural Networks
- 08/10/2023: ReLU and Addition-based Gated RNN
- 02/22/2023: Learning from Predictions: Fusing Training and Autoregressive Inference for Long-Term Spatiotemporal Forecasts
- 05/23/2018: Highway State Gating for Recurrent Highway Networks: improving information flow through time
