Feedforward Sequential Memory Networks: A New Structure to Learn Long-term Dependency

12/28/2015
by Shiliang Zhang, et al.

In this paper, we propose a novel neural network structure, the feedforward sequential memory network (FSMN), to model long-term dependency in time series without using recurrent feedback. The proposed FSMN is a standard fully-connected feedforward neural network equipped with learnable memory blocks in its hidden layers. The memory blocks use a tapped-delay line structure to encode long context information into a fixed-size representation, serving as a short-term memory mechanism. We have evaluated the proposed FSMNs on several standard benchmark tasks, including speech recognition and language modeling. Experimental results show that FSMNs significantly outperform conventional recurrent neural networks (RNNs), including LSTMs, in modeling sequential signals such as speech and language. Moreover, FSMNs can be trained much more reliably and quickly than RNNs or LSTMs, thanks to their inherently non-recurrent structure.
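To make the memory-block idea concrete, below is a minimal sketch of the scalar tapped-delay-line computation described above: each memory vector is a learnable weighted sum of the current and previous hidden activations, h̃_t = Σ_{i=0}^{N} a_i · h_{t−i}. The function name, NumPy implementation, and toy dimensions are illustrative assumptions, not code from the paper.

```python
import numpy as np

def fsmn_memory_block(hidden_seq, coeffs):
    """Tapped-delay-line memory block (scalar FSMN sketch).

    hidden_seq: array of shape (T, D) -- hidden activations h_1 .. h_T.
    coeffs:     array of shape (N+1,) -- learnable scalar taps a_0 .. a_N
                applied to the current frame and the N previous frames.

    Returns an array of shape (T, D) whose t-th row is the fixed-size
    memory vector  h~_t = sum_{i=0}^{N} a_i * h_{t-i}  (missing past
    frames are treated as zeros).
    """
    T, D = hidden_seq.shape
    memory = np.zeros_like(hidden_seq)
    for t in range(T):
        for i, a in enumerate(coeffs):
            if t - i >= 0:
                memory[t] += a * hidden_seq[t - i]
    return memory

# Toy usage: 5 time steps, 3-dimensional hidden layer, current frame plus 2 past taps.
h = np.random.randn(5, 3)
a = np.array([0.5, 0.3, 0.2])
m = fsmn_memory_block(h, a)
print(m.shape)  # (5, 3)
```

Because each memory vector depends only on a fixed window of past activations rather than on a recurrent state, the whole sequence can be processed with ordinary feedforward computation, which is what lets FSMNs be trained with standard backpropagation instead of backpropagation through time.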


