Sequence Modeling with Unconstrained Generation Order

11/01/2019
by Dmitrii Emelianenko et al.

The dominant approach to sequence generation is to produce a sequence in some predefined order, e.g. left to right. In contrast, we propose a more general model that can generate the output sequence by inserting tokens at arbitrary positions, in any order. Our model learns the decoding order as a result of its training procedure. Our experiments show that this model outperforms fixed-order models on a number of sequence generation tasks, such as Machine Translation, Image-to-LaTeX and Image Captioning.
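To make the idea concrete, the sketch below shows what a greedy insertion-based decoding loop could look like: at each step the model scores every (position, token) insertion into the current partial output and applies the best one. This is only an illustration of the general scheme, not the paper's actual architecture; the score_insertions scorer and the toy vocabulary are hypothetical stand-ins for a learned model conditioned on the input.

```python
import random

VOCAB = ["the", "cat", "sat", "on", "mat", "<eos>"]

def score_insertions(prefix):
    # Hypothetical scorer: assigns a score to every (position, token) insertion.
    # A real model would condition on the source sequence and the current prefix.
    return {(pos, tok): random.random()
            for pos in range(len(prefix) + 1)
            for tok in VOCAB}

def decode(max_steps=10):
    # Greedy insertion-based decoding: repeatedly apply the best-scoring
    # insertion anywhere in the partial output, stopping on the <eos> action.
    out = []
    for _ in range(max_steps):
        scores = score_insertions(out)
        (pos, tok), _ = max(scores.items(), key=lambda kv: kv[1])
        if tok == "<eos>":
            break
        out.insert(pos, tok)  # insert at an arbitrary position, not just the end
    return out

print(decode())
```

Because every step chooses both a position and a token, the decoder is free to realize any generation order; conventional left-to-right decoding is the special case where the chosen position is always the end of the prefix.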

