Improving speech recognition by revising gated recurrent units

09/29/2017
by Mirco Ravanelli, et al.

Speech recognition has benefited substantially from deep learning, with modern Recurrent Neural Networks (RNNs) delivering considerable gains. The most popular RNNs are Long Short-Term Memory networks (LSTMs), which typically reach state-of-the-art performance in many tasks thanks to their ability to learn long-term dependencies and their robustness to vanishing gradients. Nevertheless, LSTMs have a rather complex design with three multiplicative gates, which can hinder their efficient implementation. An attempt to simplify LSTMs has recently led to Gated Recurrent Units (GRUs), which are based on just two multiplicative gates. This paper builds on these efforts by further revising GRUs and proposing a simplified architecture that is potentially more suitable for speech recognition. The contribution of this work is twofold. First, we propose removing the reset gate from the GRU design, resulting in a more efficient single-gate architecture. Second, we propose replacing tanh with ReLU activations in the state update equations. Results show that, in our implementation, the revised architecture reduces the per-epoch training time by more than 30% and consistently improves recognition performance across different tasks, input features, and noisy conditions when compared to a standard GRU.
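To make the two proposed revisions concrete, the sketch below implements a minimal single-gate recurrent cell along the lines the abstract describes: the reset gate is dropped, and the candidate state uses a ReLU instead of tanh. This is a NumPy illustration of the update equations only, not the authors' implementation; the class name `LightGRUCell`, the `sigmoid` helper, and the uniform weight initialization are assumptions made for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LightGRUCell:
    """Minimal single-gate recurrent cell: a GRU with the reset gate
    removed and ReLU in place of tanh for the candidate state.
    Illustrative sketch only; names and initialization are assumptions."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Update-gate parameters (z) and candidate-state parameters (h).
        # With the reset gate removed, these are the only two parameter sets.
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.bz = np.zeros(hidden_size)
        self.Wh = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uh = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.bh = np.zeros(hidden_size)

    def step(self, x, h_prev):
        # Update gate: z_t = sigma(Wz x_t + Uz h_{t-1} + bz)
        z = sigmoid(self.Wz @ x + self.Uz @ h_prev + self.bz)
        # Candidate state with ReLU (tanh in a standard GRU); note there is
        # no reset gate modulating h_prev here.
        h_cand = np.maximum(0.0, self.Wh @ x + self.Uh @ h_prev + self.bh)
        # Interpolate between the previous and candidate states.
        return z * h_prev + (1.0 - z) * h_cand

# Usage: run the cell over a short random feature sequence.
cell = LightGRUCell(input_size=40, hidden_size=64)
h = np.zeros(64)
for x in np.random.default_rng(1).standard_normal((5, 40)):
    h = cell.step(x, h)
print(h.shape)  # (64,)
```

With one gate instead of two, each time step needs two affine transforms rather than three, which is consistent with the reported reduction in per-epoch training time. The full paper also pairs the ReLU with batch normalization of the feed-forward connections to keep the unbounded activations stable; that detail is omitted from this sketch.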
