Dropout improves Recurrent Neural Networks for Handwriting Recognition

11/05/2013
by   Vu Pham, et al.

Recurrent neural networks (RNNs) with Long Short-Term Memory cells currently hold the best known results in unconstrained handwriting recognition. We show that their performance can be greatly improved using dropout, a recently proposed regularization method for deep architectures. While previous work showed that dropout gives superior performance in the context of convolutional networks, it had never been applied to RNNs. In our approach, dropout is applied carefully so that it does not affect the recurrent connections; the power of RNNs in modeling sequences is therefore preserved. Extensive experiments on a broad range of handwriting databases confirm the effectiveness of dropout on deep architectures, even when the network consists mainly of recurrent and shared connections.
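The key idea of the abstract can be sketched in a few lines: dropout masks the feed-forward connections (the inputs flowing into each layer) while the hidden-to-hidden recurrence is left untouched. The sketch below uses a plain tanh RNN cell instead of LSTM cells, and all names (`dropout`, `simple_rnn_layer`, the weight matrices) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p, train=True):
    """Inverted dropout: zero each unit with probability p and rescale
    the survivors by 1/(1-p) so the expected activation is unchanged."""
    if not train or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def simple_rnn_layer(xs, W_in, W_rec, drop_p, train=True):
    """One tanh RNN layer over a sequence xs. Dropout is applied ONLY to
    the input of each time step (a feed-forward connection); the
    recurrent path h -> h is never masked, so information carried
    through time is preserved."""
    h = np.zeros(W_rec.shape[0])
    outputs = []
    for x in xs:
        x = dropout(x, drop_p, train)       # feed-forward connection: dropped
        h = np.tanh(W_in @ x + W_rec @ h)   # recurrent connection: intact
        outputs.append(h)
    return outputs
```

In a stacked architecture, the same scheme amounts to placing dropout between layers (on each layer's output before it feeds the next layer) but never inside the recurrence of any layer.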


Related research

09/08/2014 · Recurrent Neural Network Regularization
We present a simple regularization technique for Recurrent Neural Networ...

03/16/2016 · Recurrent Dropout without Memory Loss
This paper presents a novel approach to recurrent neural network (RNN) r...

08/23/2020 · Variational Inference-Based Dropout in Recurrent Neural Networks for Slot Filling in Spoken Language Understanding
This paper proposes to generalize the variational recurrent neural netwo...

12/09/2020 · Have convolutions already made recurrence obsolete for unconstrained handwritten text recognition?
Unconstrained handwritten text recognition remains an important challeng...

10/21/2014 · Regularizing Recurrent Networks - On Injected Noise and Norm-based Methods
Advancements in parallel processing have led to a surge in multilayer p...

05/29/2018 · Deep Learning under Privileged Information Using Heteroscedastic Dropout
Unlike machines, humans learn through rapid, abstract model-building. Th...

01/31/2021 · Fine-tuning Handwriting Recognition systems with Temporal Dropout
This paper introduces a novel method to fine-tune handwriting recognitio...
