Seq2Biseq: Bidirectional Output-wise Recurrent Neural Networks for Sequence Modelling

04/09/2019, by Marco Dinarelli, et al.

Over the last few years, Recurrent Neural Networks (RNNs) have reached state-of-the-art performance on most sequence modelling problems. In particular, the "sequence to sequence" model and the neural CRF have proved to be very effective in this domain. In this article, we propose a new RNN architecture for sequence labelling that leverages gated recurrent layers to take arbitrarily long contexts into account, and that uses two decoders operating forward and backward over the output sequence. We compare several variants of the proposed solution against the state of the art. Most of our results are better than the state of the art or very close to it, and thanks to the use of recent technologies, our architecture can scale to corpora larger than those used in this work.
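To make the architecture concrete, below is a minimal PyTorch sketch of a sequence labeller with a bidirectional gated encoder and two label decoders, one running over the sequence forward and one backward. All names and dimensions (BiDecoderTagger, hid_dim, the simple concatenation used to combine the two directions) are illustrative assumptions, not the authors' implementation, and the exact way the two decoders interact in the paper may differ.

# Minimal sketch of a sequence labeller with one encoder and two
# label decoders (forward and backward). Names and the combination
# scheme are illustrative, not the authors' code.
import torch
import torch.nn as nn

class BiDecoderTagger(nn.Module):
    def __init__(self, vocab_size, num_labels, emb_dim=100, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional gated encoder over the input sequence.
        self.encoder = nn.GRU(emb_dim, hid_dim, bidirectional=True,
                              batch_first=True)
        # Backward decoder reads the encoded sequence right-to-left.
        self.bwd_decoder = nn.GRU(2 * hid_dim, hid_dim, batch_first=True)
        # Forward decoder reads left-to-right and also sees the
        # backward decoder's state at each position.
        self.fwd_decoder = nn.GRU(3 * hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(2 * hid_dim, num_labels)

    def forward(self, tokens):                       # tokens: (B, T)
        enc, _ = self.encoder(self.embed(tokens))    # (B, T, 2H)
        # Run the backward decoder on the reversed sequence, then
        # flip its outputs back to the original time order.
        bwd, _ = self.bwd_decoder(enc.flip(1))       # (B, T, H)
        bwd = bwd.flip(1)
        fwd, _ = self.fwd_decoder(torch.cat([enc, bwd], dim=-1))
        # Score labels from both decoding directions.
        return self.out(torch.cat([fwd, bwd], dim=-1))  # (B, T, num_labels)

tagger = BiDecoderTagger(vocab_size=10000, num_labels=20)
logits = tagger(torch.randint(0, 10000, (2, 15)))    # batch of 2, length 15
print(logits.shape)                                  # torch.Size([2, 15, 20])

In this sketch the backward decoder runs first, so when the forward decoder labels position t it also sees a summary of everything to the right of t; the final classifier then scores each label from both decoding directions.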


Related research

06/08/2016, Improving Recurrent Neural Networks For Sequence Labelling
In this paper we study different types of Recurrent Neural Networks (RNN...

09/16/2019, Hybrid Neural Models For Sequence Modelling: The Best Of Three Worlds
We propose a neural architecture with the main characteristics of the mo...

07/21/2023, On the Universality of Linear Recurrences Followed by Nonlinear Projections
In this note (work in progress towards a full-length paper) we show that...

12/16/2022, Preventing RNN from Using Sequence Length as a Feature
Recurrent neural networks are deep learning topologies that can be train...

01/13/2020, Residual Attention Net for Superior Cross-Domain Time Sequence Modeling
We present a novel architecture, residual attention net (RAN), which mer...

05/17/2016, Deep Action Sequence Learning for Causal Shape Transformation
Deep learning became the method of choice in recent years for solving a w...

06/13/2016, Neural Associative Memory for Dual-Sequence Modeling
Many important NLP problems can be posed as dual-sequence or sequence-to...
