Cascaded Text Generation with Markov Transformers

06/01/2020
by Yuntian Deng, et al.

The two dominant approaches to neural text generation are fully autoregressive models, using serial beam search decoding, and non-autoregressive models, using parallel decoding with no output dependencies. This work proposes an autoregressive model with sub-linear parallel time generation. Noting that conditional random fields with bounded context can be decoded in parallel, we propose an efficient cascaded decoding approach for generating high-quality output. To parameterize this cascade, we introduce a Markov transformer, a variant of the popular fully autoregressive model that allows us to simultaneously decode with specific autoregressive context cutoffs. This approach requires only a small modification from standard autoregressive training, while showing a competitive accuracy/speed tradeoff compared to existing methods on five machine translation datasets.
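
The cascade described in the abstract can be illustrated in miniature. The sketch below is not the paper's implementation: random numpy arrays stand in for the unary and bigram potentials a Markov transformer would produce, a single zeroth-to-first-order stage prunes the candidate lattice with max-marginals, and a Viterbi pass decodes the surviving edges. The sequence length, vocabulary size, top-K width, and pruning margin are all illustrative choices.

```python
# Minimal sketch of cascaded decoding over a pruned candidate lattice.
# NOTE: illustrative only, not the paper's implementation. Random arrays
# stand in for the potentials a Markov transformer would produce, and
# T, V, K and the pruning margin are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
T, V, K = 6, 50, 4   # sequence length, vocabulary size, candidates kept per position

# Stage 0 (zeroth order): score every token at every position independently,
# in parallel, and keep the top-K candidates per position.
unary = rng.normal(size=(T, V))                    # stand-in for 0th-order logits
cand = np.argsort(-unary, axis=1)[:, :K]           # (T, K) surviving token ids
cand_scores = np.take_along_axis(unary, cand, axis=1)

# Stage 1 (first order): score transitions only between surviving candidates.
# A real Markov transformer would emit these bigram potentials; here they are random.
pair = rng.normal(size=(T - 1, K, K))              # pair[t, i, j]: cand[t, i] -> cand[t + 1, j]
pair += cand_scores[:-1, :, None]                  # fold each position's unary score into its outgoing edges
pair[-1] += cand_scores[-1][None, :]               # ...and the last position's into its incoming edges

# Max-marginal of every edge in the first-order chain (max-product forward/backward).
fwd = np.zeros((T, K))
bwd = np.zeros((T, K))
for t in range(1, T):
    fwd[t] = (fwd[t - 1][:, None] + pair[t - 1]).max(axis=0)
for t in range(T - 2, -1, -1):
    bwd[t] = (pair[t] + bwd[t + 1][None, :]).max(axis=1)
edge_mm = fwd[:-1, :, None] + pair + bwd[1:, None, :]

# Prune edges whose max-marginal falls far below the best path score; a higher-order
# stage would rescore and prune the surviving lattice again in the same way.
keep = edge_mm >= edge_mm.max() - 5.0              # illustrative pruning margin

# Viterbi decode over the edges that survived the cascade.
score = np.zeros(K)
back = np.zeros((T, K), dtype=int)
for t in range(1, T):
    trans = np.where(keep[t - 1], pair[t - 1], -np.inf)
    totals = score[:, None] + trans
    back[t] = totals.argmax(axis=0)
    score = totals.max(axis=0)

path = [int(score.argmax())]
for t in range(T - 1, 0, -1):
    path.append(int(back[t, path[-1]]))
path.reverse()
print([int(cand[t, k]) for t, k in enumerate(path)])  # decoded token ids
```

In the paper's full cascade, later stages would supply higher-order potentials from the same Markov transformer decoded with larger context cutoffs; the sketch stops at first order to stay short, but each stage follows the same parallel prune-and-rescore pattern.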

