Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement

02/19/2018
by Jason Lee, et al.

We propose a conditional non-autoregressive neural sequence model based on iterative refinement. The proposed model is designed following the principles of latent variable models and denoising autoencoders, and is generally applicable to any sequence generation task. We extensively evaluate the proposed model on machine translation (En-De and En-Ro) and image caption generation, and observe that it significantly speeds up decoding while maintaining generation quality comparable to that of the autoregressive counterpart.
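The sketch below illustrates the decoding idea described in the abstract: produce a draft for every target position in one parallel pass, then repeatedly denoise the draft until it stops changing. This is a minimal sketch, assuming hypothetical `encoder`, `length_predictor`, and `decoder` components with `initial_guess` and `refine` methods; these names, the fixed-point stopping check, and `max_iters` are illustrative placeholders rather than the authors' exact implementation.

```python
# Minimal sketch of non-autoregressive decoding by iterative refinement.
# `encoder`, `length_predictor`, and `decoder` are hypothetical placeholder
# components, not the authors' exact API.

def iterative_refinement_decode(encoder, length_predictor, decoder,
                                source_tokens, max_iters=10):
    """Draft every target position in parallel, then iteratively denoise."""
    # Encode the source sequence once; it conditions every refinement step.
    source_repr = encoder(source_tokens)

    # Predict the target length and emit an initial draft in a single
    # parallel pass (no left-to-right dependency between positions).
    target_len = length_predictor(source_repr)
    draft = decoder.initial_guess(source_repr, target_len)

    for _ in range(max_iters):
        # Treat the current draft as a corrupted target and denoise it,
        # again predicting all positions in parallel.
        refined = decoder.refine(source_repr, draft)

        # Deterministic refinement: stop once the output reaches a fixed
        # point, otherwise keep iterating up to max_iters.
        if refined == draft:
            break
        draft = refined

    return draft
```

Because every position is predicted simultaneously, each refinement step costs a single decoder pass, so a small number of iterations can still be substantially faster than token-by-token autoregressive decoding.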


Related research

08/20/2019 · Latent-Variable Non-Autoregressive Neural Machine Translation with Deterministic Inference using a Delta Posterior
Although neural machine translation models reached high translation qual...

09/05/2019 · FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow
Most sequence-to-sequence (seq2seq) models are autoregressive; they gene...

11/12/2018 · End-to-End Non-Autoregressive Neural Machine Translation with Connectionist Temporal Classification
Autoregressive decoding is the only part of sequence-to-sequence models ...

06/03/2019 · Masked Non-Autoregressive Image Captioning
Existing captioning models often adopt the encoder-decoder architecture,...

09/15/2020 · Iterative Refinement in the Continuous Space for Non-Autoregressive Neural Machine Translation
We propose an efficient inference procedure for non-autoregressive machi...

04/28/2021 · Learning deep autoregressive models for hierarchical data
We propose a model for hierarchical structured data as an extension to t...

04/02/2023 · FANS: Fast Non-Autoregressive Sequence Generation for Item List Continuation
User-curated item lists, such as video-based playlists on Youtube and bo...
