Feedforward Sequential Memory Neural Networks without Recurrent Feedback

10/09/2015
by Shiliang Zhang, et al.

We introduce a new structure for memory neural networks, called feedforward sequential memory networks (FSMN), which can learn long-term dependency without using recurrent feedback. The proposed FSMN is a standard feedforward neural network equipped with learnable sequential memory blocks in its hidden layers. In this work, we apply FSMN to several language modeling (LM) tasks. Experimental results show that the memory blocks in FSMN can learn effective representations of long-term history, and that FSMN-based language models significantly outperform not only feedforward neural network (FNN) based LMs but also the popular recurrent neural network (RNN) based LMs.
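As a rough illustration of the idea in the abstract, the sketch below shows a scalar FSMN-style memory block in PyTorch: each hidden state is augmented with a learned weighted sum of the current and the previous N hidden activations, so history is captured by a tapped-delay (FIR-filter-like) structure rather than recurrent feedback. The class name, the memory order, the tap initialization, and the choice to add the memory output back to the hidden state are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a scalar FSMN-style memory block (assumed formulation:
# memory_t = sum_{i=0}^{N} a_i * h_{t-i}, combined with the feedforward path).
import torch
import torch.nn as nn


class FSMNMemoryBlock(nn.Module):
    """Feedforward memory over past hidden states: a learnable filter over
    a fixed window of history, so no recurrent connection is required."""

    def __init__(self, hidden_size: int, memory_order: int = 20):
        super().__init__()
        self.memory_order = memory_order
        # One scalar tap per lag (scalar FSMN); a vector variant would use a
        # (memory_order + 1, hidden_size) parameter instead.
        self.taps = nn.Parameter(torch.randn(memory_order + 1) * 0.01)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, time, hidden_size)
        batch, time, hidden = h.shape
        # Left-pad the time axis so every step can look back memory_order frames.
        padded = torch.nn.functional.pad(h, (0, 0, self.memory_order, 0))
        memory = torch.zeros_like(h)
        for i in range(self.memory_order + 1):
            # Lag i contributes taps[i] * h_{t-i}; the slice aligns h_{t-i}
            # with position t (zeros where t - i < 0).
            start = self.memory_order - i
            memory = memory + self.taps[i] * padded[:, start:start + time, :]
        # Combine the memory output with the ordinary feedforward activation.
        return h + memory


if __name__ == "__main__":
    block = FSMNMemoryBlock(hidden_size=128, memory_order=20)
    x = torch.randn(4, 50, 128)   # (batch, time, hidden)
    y = block(x)
    print(y.shape)                # torch.Size([4, 50, 128])
```

In a full model, one such block would sit inside each hidden layer of an otherwise standard feedforward LM; the window length (memory order) controls how much history the layer can represent.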

Related research

12/28/2015 · Feedforward Sequential Memory Networks: A New Structure to Learn Long-term Dependency
In this paper, we propose a novel neural network structure, namely feedf...

04/20/2023 · Observer-Feedback-Feedforward Controller Structures in Reinforcement Learning
The paper proposes the use of structured neural networks for reinforceme...

11/08/2018 · Linear Memory Networks
Recurrent neural networks can learn complex transduction problems that r...

08/22/2016 · Surprisal-Driven Feedback in Recurrent Networks
Recurrent neural nets are widely used for predicting temporal data. Thei...

05/24/2019 · On Recurrent Neural Networks for Sequence-based Processing in Communications
In this work, we analyze the capabilities and practical limitations of n...

08/06/2022 · Learning Human Cognitive Appraisal Through Reinforcement Memory Unit
We propose a novel memory-enhancing mechanism for recurrent neural netwo...
