Online Sequence Training of Recurrent Neural Networks with Connectionist Temporal Classification

11/21/2015
by Kyuyeon Hwang, et al.

Connectionist temporal classification (CTC) based supervised sequence training of recurrent neural networks (RNNs) has shown great success in many machine learning areas, including end-to-end speech and handwritten character recognition. For CTC training, however, the RNN must be unrolled (or unfolded) over the full length of the input sequence. This unrolling requires a large amount of memory and hinders small-footprint implementations of online learning or adaptation. Furthermore, the lengths of training sequences are usually not uniform, which makes parallel training with multiple sequences inefficient on shared-memory models such as graphics processing units (GPUs). In this work, we introduce an expectation-maximization (EM) based online CTC algorithm that enables unidirectional RNNs to learn sequences that are longer than the amount of unrolling. The RNNs can also be trained to process an infinitely long input sequence without pre-segmentation or external reset. Moreover, the proposed approach allows efficient parallel training on GPUs. For evaluation, phoneme recognition and end-to-end speech recognition examples are presented on the TIMIT and Wall Street Journal (WSJ) corpora, respectively. Our online model achieves a 20.7% phoneme error rate (PER) on the very long input sequence generated by concatenating all 192 utterances in the TIMIT core test set. On WSJ, a network can be trained with only 64 frames of unrolling at the cost of a 4.5% relative increase in word error rate (WER).
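
As a point of reference (this is not the authors' code), the minimal sketch below uses PyTorch to show the standard offline CTC setup that the paper improves on: the unidirectional LSTM is unrolled over all T frames of an utterance, so the stored activations, and hence memory, grow linearly with the sequence length. The proposed EM-based online algorithm instead trains with a fixed, bounded amount of unrolling. All layer sizes and tensors here are illustrative placeholders.

# Minimal sketch of standard (offline) CTC training in PyTorch, illustrating
# why the RNN must be unrolled over the whole utterance: the CTC loss needs
# per-frame outputs for every one of the T frames before backprop can start,
# so activation memory grows linearly with the sequence length.
# All sizes and data below are placeholders, not values from the paper.
import torch
import torch.nn as nn

T, N, D, H, C = 500, 4, 40, 128, 62    # frames, batch, input dim, hidden, labels (0 = blank)

rnn = nn.LSTM(input_size=D, hidden_size=H)    # unidirectional LSTM
proj = nn.Linear(H, C)                        # per-frame label scores
ctc = nn.CTCLoss(blank=0)

x = torch.randn(T, N, D)                      # one batch of full-length utterances
targets = torch.randint(1, C, (N, 50))        # dummy label sequences (no blanks)
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.full((N,), 50, dtype=torch.long)

h, _ = rnn(x)                                 # unrolled over all T frames; activations stored
log_probs = proj(h).log_softmax(dim=-1)       # (T, N, C) frame-level label posteriors
loss = ctc(log_probs, targets, input_lengths, target_lengths)
loss.backward()                               # backward pass over the full unrolled graph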

Related research

03/22/2013: Speech Recognition with Deep Recurrent Neural Networks
Recurrent neural networks (RNNs) are a powerful model for sequential dat...

01/25/2016: Character-Level Incremental Speech Recognition with Recurrent Neural Networks
In real-time speech recognition applications, the latency is an importan...

10/10/2016: Latent Sequence Decompositions
We present the Latent Sequence Decompositions (LSD) framework. LSD decom...

12/16/2016: Delta Networks for Optimized Recurrent Network Computation
Many neural networks exhibit stability in their activation patterns over...

07/21/2018: Inductive Visual Localisation: Factorised Training for Superior Generalisation
End-to-end trained Recurrent Neural Networks (RNNs) have been successful...

05/24/2019: On Recurrent Neural Networks for Sequence-based Processing in Communications
In this work, we analyze the capabilities and practical limitations of n...

04/26/2018: Sparse Persistent RNNs: Squeezing Large Recurrent Networks On-Chip
Recurrent Neural Networks (RNNs) are powerful tools for solving sequence...
