Efficiency Evaluation of Character-level RNN Training Schedules

05/09/2016
by Cedric De Boom et al.

We present four training and prediction schedules for the same character-level recurrent neural network. The efficiency of these schedules is evaluated in terms of model effectiveness as a function of training time and of the amount of training data seen. We show that the choice of training and prediction schedule can have a considerable impact on prediction effectiveness for a given training budget.
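
The abstract does not enumerate the four schedules. As a purely illustrative sketch (not the authors' setup), the Python/PyTorch snippet below shows a character-level RNN trained under one common schedule, fixed-length windows with teacher forcing; the toy corpus, model size, window length, and step count are assumptions made for this example.

import torch
import torch.nn as nn

# Toy corpus and character vocabulary (assumption, for illustration only).
text = "hello world, hello recurrent networks. " * 50
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

class CharRNN(nn.Module):
    """Minimal character-level RNN: embedding -> GRU -> next-character logits."""
    def __init__(self, vocab, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.head(out), h

window, batch = 32, 16          # fixed-length training windows (assumed schedule)
model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    # Sample random windows; the target at each position is the next character.
    starts = torch.randint(0, len(data) - window - 1, (batch,))
    x = torch.stack([data[s:s + window] for s in starts])
    y = torch.stack([data[s + 1:s + window + 1] for s in starts])
    logits, _ = model(x)        # teacher forcing: ground-truth characters as input
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()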

Related research

09/11/2017  Recurrent neural networks based Indic word-wise script identification using character-wise training
This paper presents a novel methodology of Indic handwritten script reco...

12/30/2015  Online Keyword Spotting with a Character-Level Recurrent Neural Network
In this paper, we propose a context-aware keyword spotting model employi...

09/13/2016  Character-Level Language Modeling with Hierarchical Recurrent Neural Networks
Recurrent neural network (RNN) based character-level language models (CL...

01/02/2018  Character-level Recurrent Neural Networks in Practice: Comparing Training and Sampling Schemes
Recurrent neural networks are nowadays successfully used in an abundance...

01/11/2020  Authorship Attribution in Bangla literature using Character-level CNN
Characters are the smallest unit of text that can extract stylometric si...

11/26/2019  An Optimized and Energy-Efficient Parallel Implementation of Non-Iteratively Trained Recurrent Neural Networks
Recurrent neural networks (RNN) have been successfully applied to variou...

10/24/2016  Surprisal-Driven Zoneout
We propose a novel method of regularization for recurrent neural network...
