Stabilising and accelerating light gated recurrent units for automatic speech recognition

02/16/2023
by Adel Moumen et al.

The light gated recurrent unit (Li-GRU) is well known for achieving impressive results in automatic speech recognition (ASR) tasks while being lighter and faster to train than a standard gated recurrent unit (GRU). However, the unbounded nature of the rectified linear unit used for its candidate hidden state induces a severe gradient explosion phenomenon that disrupts training and prevents the architecture from being applied to widely used datasets. In this paper, we theoretically and empirically derive the conditions necessary for its stability, together with engineering mechanisms that speed up training by a factor of five, yielding a novel version of the architecture named SLi-GRU. We then evaluate its performance on a toy task that illustrates its newly acquired capabilities, and on three different ASR datasets, where it achieves lower word error rates than more complex recurrent neural networks.
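
To make the mechanism discussed in the abstract concrete, here is a minimal sketch of a standard Li-GRU cell, the architecture the paper builds on, not the authors' SLi-GRU implementation. It uses a single update gate and a ReLU candidate state with batch-normalised feed-forward projections; the unbounded ReLU recurrence is the part identified as the source of exploding gradients. All names and the exact layer layout are illustrative assumptions.

```python
# Minimal Li-GRU cell sketch (illustrative, not the authors' SLi-GRU code).
import torch
import torch.nn as nn

class LiGRUCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Feed-forward projections, batch-normalised as in the original Li-GRU.
        self.wz = nn.Linear(input_size, hidden_size, bias=False)
        self.wh = nn.Linear(input_size, hidden_size, bias=False)
        self.bn_z = nn.BatchNorm1d(hidden_size)
        self.bn_h = nn.BatchNorm1d(hidden_size)
        # Recurrent projections (no reset gate in the Li-GRU).
        self.uz = nn.Linear(hidden_size, hidden_size, bias=False)
        self.uh = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, x_t: torch.Tensor, h_prev: torch.Tensor) -> torch.Tensor:
        # Update gate: z_t = sigmoid(BN(W_z x_t) + U_z h_{t-1})
        z_t = torch.sigmoid(self.bn_z(self.wz(x_t)) + self.uz(h_prev))
        # Candidate state: ReLU instead of tanh, hence unbounded activations,
        # which is what can make gradients explode over long sequences.
        h_tilde = torch.relu(self.bn_h(self.wh(x_t)) + self.uh(h_prev))
        # Interpolate between the previous state and the candidate.
        return z_t * h_prev + (1.0 - z_t) * h_tilde


if __name__ == "__main__":
    cell = LiGRUCell(input_size=40, hidden_size=128)
    x = torch.randn(8, 40)      # batch of 8 acoustic feature frames
    h = torch.zeros(8, 128)     # initial hidden state
    h = cell(x, h)              # one recurrent step
    print(h.shape)              # torch.Size([8, 128])
```

Stepping this cell over long utterances is where the instability appears: because h_tilde is not squashed, the recurrent Jacobian is not bounded, and the paper's contribution is precisely the stability conditions and speed-ups (SLi-GRU) that address this.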
