
RETURNN: The RWTH Extensible Training framework for Universal Recurrent Neural Networks

by Patrick Doetsch et al.
RWTH Aachen University

In this work we release our extensible and easily configurable neural network training software. It provides a rich set of functional layers with a particular focus on efficient training of recurrent neural network topologies on multiple GPUs. The source code of the software package is public and freely available for academic research purposes, and the software can be used as a framework or as a standalone tool with a flexible configuration. It allows training of state-of-the-art deep bidirectional long short-term memory (LSTM) models on both one-dimensional data, such as speech, and two-dimensional data, such as handwritten text, and was used to develop successful submission systems in several evaluation campaigns.
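To make the core model concrete, the following is a minimal NumPy sketch of the bidirectional LSTM layer that such a toolkit trains: one LSTM pass over the sequence left-to-right, one right-to-left, with the two hidden states concatenated per time step. All function and parameter names here are illustrative, not RETURNN's actual API, and the sketch covers only the forward pass, not training.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    # One LSTM time step; the stacked pre-activations hold the
    # input, forget, cell, and output gates in that order.
    n = h.shape[0]
    z = W @ x + U @ h + b
    i = 1.0 / (1.0 + np.exp(-z[:n]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[n:2*n]))     # forget gate
    g = np.tanh(z[2*n:3*n])                 # candidate cell state
    o = 1.0 / (1.0 + np.exp(-z[3*n:]))      # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def lstm_forward(xs, W, U, b, n_hidden):
    # Unroll the LSTM over a (T, n_in) sequence; returns (T, n_hidden).
    h = np.zeros(n_hidden)
    c = np.zeros(n_hidden)
    outs = []
    for x in xs:
        h, c = lstm_step(x, h, c, W, U, b)
        outs.append(h)
    return np.stack(outs)

def bidirectional_lstm(xs, params_fwd, params_bwd, n_hidden):
    # Forward-in-time and backward-in-time passes, concatenated
    # per time step, as in a deep bidirectional LSTM layer.
    fwd = lstm_forward(xs, *params_fwd, n_hidden)
    bwd = lstm_forward(xs[::-1], *params_bwd, n_hidden)[::-1]
    return np.concatenate([fwd, bwd], axis=1)

rng = np.random.default_rng(0)
n_in, n_hidden, T = 5, 8, 12

def init_params():
    # Small random weights for the 4 stacked gates (illustrative only).
    return (rng.standard_normal((4 * n_hidden, n_in)) * 0.1,
            rng.standard_normal((4 * n_hidden, n_hidden)) * 0.1,
            np.zeros(4 * n_hidden))

xs = rng.standard_normal((T, n_in))
out = bidirectional_lstm(xs, init_params(), init_params(), n_hidden)
print(out.shape)  # (12, 16): T time steps, 2 * n_hidden features
```

Stacking several such layers, with the concatenated output of one layer feeding the next, yields the deep bidirectional LSTM topology described in the abstract; for two-dimensional inputs like handwriting images, the recurrence runs over both axes instead of a single time axis.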

