Memory Visualization for Gated Recurrent Neural Networks in Speech Recognition

by Zhiyuan Tang, et al.
Tsinghua University

Recurrent neural networks (RNNs) have shown clear superiority in sequence modeling, particularly those with gated units, such as long short-term memory (LSTM) and the gated recurrent unit (GRU). However, the dynamic properties behind their remarkable performance remain unclear in many applications, e.g., automatic speech recognition (ASR). This paper employs visualization techniques to study the behavior of LSTM and GRU when performing speech recognition tasks. Our experiments reveal some interesting patterns in the gated memory, and some of them have inspired simple yet effective modifications to the network structure. We report two such modifications: (1) lazy cell update in LSTM, and (2) shortcut connections for residual learning. Both modifications lead to more comprehensible and powerful networks.
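The kind of memory visualization the abstract describes amounts to recording the gate and cell activations of a recurrent unit at every time step and inspecting them as a sequence. As a minimal sketch (not the paper's implementation), the toy single-unit LSTM below steps through an input sequence and logs its input, forget, and output gates together with the cell state; the weight values are arbitrary placeholders chosen only for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One step of a scalar (single-unit) LSTM, returning the new
    hidden state, cell state, and a dict of activations for plotting."""
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])    # input gate
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])    # forget gate
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])    # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate cell
    c = f * c_prev + i * g        # gated memory update
    h = o * math.tanh(c)          # hidden output
    return h, c, {"i": i, "f": f, "o": o, "c": c}

# Arbitrary placeholder weights; a real model would learn these.
w = {k: 0.5 for k in ("wi", "ui", "bi", "wf", "uf", "bf",
                      "wo", "uo", "bo", "wg", "ug", "bg")}

# Run over a toy input sequence and collect the activation trace.
h, c, trace = 0.0, 0.0, []
for x in [1.0, -1.0, 0.5, 0.0]:
    h, c, acts = lstm_step(x, h, c, w)
    trace.append(acts)
```

Plotting `trace` over time (one curve per gate, one for the cell state) is the basic visualization; patterns such as a gate saturating or the cell state drifting are exactly the kind of behavior that motivates structural changes like the lazy cell update.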

