High Order Recurrent Neural Networks for Acoustic Modelling

02/22/2018
by Chao Zhang, et al.

Vanishing long-term gradients are a major issue in training standard recurrent neural networks (RNNs), which can be alleviated by long short-term memory (LSTM) models with memory cells. However, the extra parameters associated with the memory cells mean an LSTM layer has four times as many parameters as an RNN with the same hidden vector size. This paper addresses the vanishing gradient problem using a high order RNN (HORNN), which has additional connections from multiple previous time steps. Speech recognition experiments using British English multi-genre broadcast (MGB3) data showed that the proposed HORNN architectures for rectified linear unit and sigmoid activation functions reduced word error rates (WER) over the corresponding RNNs, and gave similar WERs to a (projected) LSTM while using only a fraction of the recurrent layer parameters and computation.
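The idea can be illustrated with a minimal forward pass. The sketch below is an assumption about the general HORNN form, not the paper's exact formulation (which also covers projected variants): alongside the standard recurrent connection to h[t-1], the layer receives h[t-2] through a separate, hypothetical weight matrix `W_rec2`, giving gradients a shorter path back to earlier time steps.

```python
import numpy as np

def hornn_forward(x_seq, W_in, W_rec, W_rec2, b, activation=np.tanh):
    """Run a high order RNN over a sequence of input vectors.

    In addition to the usual recurrent connection to h[t-1], each step
    also sees h[t-2] through its own weight matrix W_rec2, so the
    backpropagated gradient has a direct two-step path to the past.
    """
    hidden = W_rec.shape[0]
    h_prev = np.zeros(hidden)   # h[t-1]
    h_prev2 = np.zeros(hidden)  # h[t-2]
    outputs = []
    for x in x_seq:
        a = W_in @ x + W_rec @ h_prev + W_rec2 @ h_prev2 + b
        h = activation(a)
        outputs.append(h)
        h_prev2, h_prev = h_prev, h
    return np.array(outputs)
```

Each extra time-step connection costs only one additional hidden-by-hidden matrix, so a HORNN stays far smaller than an LSTM layer, which needs roughly four times the parameters of a plain RNN at the same hidden size.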


Related research

- 02/05/2014: Long Short-Term Memory Based Recurrent Neural Network Architectures for Large Vocabulary Speech Recognition
- 06/18/2018: Semi-tied Units for Efficient Gating in LSTM and Highway Networks
- 01/22/2019: Towards Non-saturating Recurrent Units for Modelling Long-term Dependencies
- 11/25/2019: Gating Revisited: Deep Multi-layer RNNs That Can Be Trained
- 07/20/2020: DiffRNN: Differential Verification of Recurrent Neural Networks
- 01/14/2017: Long Timescale Credit Assignment in Neural Networks with External Memory
- 10/11/2019: Deep Independently Recurrent Neural Network (IndRNN)
