Improving Recurrent Neural Networks For Sequence Labelling

06/08/2016
by Marco Dinarelli et al.

In this paper we study different types of Recurrent Neural Networks (RNN) for sequence labeling tasks. We propose two new variants of RNNs that integrate improvements for sequence labeling, and we compare them to the more traditional Elman and Jordan RNNs. We compare all models, both traditional and new, on four distinct sequence labeling tasks: two in Spoken Language Understanding (ATIS and MEDIA) and two in POS tagging, on the French Treebank (FTB) and the Penn Treebank (PTB) corpora. The results show that our new variants of RNNs are consistently more effective than the others.
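The abstract contrasts the classic Elman and Jordan architectures. As a quick point of reference, below is a minimal NumPy sketch of one time step of each recurrence; the function names, activation choices (tanh and softmax), and parameter shapes are illustrative assumptions, and the sketch does not reproduce the paper's proposed variants.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def elman_step(x_t, h_prev, Wx, Wh, Wy, bh, by):
    # Elman RNN: the previous hidden state is fed back into the hidden layer.
    h_t = np.tanh(Wx @ x_t + Wh @ h_prev + bh)
    y_t = softmax(Wy @ h_t + by)          # label distribution for position t
    return h_t, y_t

def jordan_step(x_t, y_prev, Wx, Wj, Wy, bh, by):
    # Jordan RNN: the previous output (label distribution) is fed back instead.
    h_t = np.tanh(Wx @ x_t + Wj @ y_prev + bh)
    y_t = softmax(Wy @ h_t + by)
    return h_t, y_t
```

In a Jordan network the recurrent signal carries the previous label prediction rather than the hidden state, which is what makes the two families behave differently on sequence labeling.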


