A Generalized Framework of Sequence Generation with Application to Undirected Sequence Models

05/29/2019
by Elman Mansimov, et al.

Undirected neural sequence models such as BERT have received renewed interest due to their success on discriminative natural language understanding tasks such as question answering and natural language inference. The problem of generating sequences directly from these models has received relatively little attention, in part because generating from such models departs significantly from the conventional monotonic, left-to-right generation of directed sequence models. We investigate this problem by first proposing a generalized model of sequence generation that unifies decoding in directed and undirected models. The proposed framework models the process of generation rather than just the resulting sequence, and under this framework we derive various neural sequence models as special cases, including autoregressive, semi-autoregressive, and refinement-based non-autoregressive models. This unification enables us to adapt decoding algorithms originally developed for directed sequence models to undirected models. We demonstrate this by evaluating various decoding strategies for the recently proposed cross-lingual masked translation model. Our experiments reveal that generation from undirected sequence models, under our framework, is competitive with the state of the art on WMT'14 English-German translation. We furthermore observe that the proposed approach enables constant-time translation while losing only 1 BLEU point compared to linear-time translation from the same undirected neural sequence model.
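
To make the framework concrete, below is a minimal sketch of one decoding strategy it covers: mask-predict style iterative refinement from a masked language model, where each step selects a set of positions (coordinate selection) and re-predicts the symbols there in parallel (symbol replacement). The `model` callable, `mask_id`, and tensor shapes are illustrative assumptions, not the authors' implementation.

```python
import torch

def iterative_generate(model, length, mask_id, num_iterations=10):
    # Hypothetical sketch: decode a sequence of a given length from an
    # undirected (masked) sequence model by iterative refinement.
    tokens = torch.full((1, length), mask_id, dtype=torch.long)
    confidences = torch.zeros(1, length)
    for t in range(num_iterations):
        # The number of positions re-predicted decays linearly over iterations,
        # as in mask-predict style decoding.
        k = max(1, int(length * (num_iterations - t) / num_iterations))
        # Coordinate selection: pick the k least-confident positions.
        positions = confidences.topk(k, largest=False).indices[0]
        masked = tokens.clone()
        masked[0, positions] = mask_id
        # Symbol replacement: re-predict all masked positions in parallel.
        # `model` is assumed to map token ids to logits of shape (1, length, vocab).
        probs = model(masked).softmax(dim=-1)
        new_conf, new_tokens = probs.max(dim=-1)
        tokens[0, positions] = new_tokens[0, positions]
        confidences[0, positions] = new_conf[0, positions]
    return tokens
```

Running the loop for a fixed number of iterations, independent of sequence length, corresponds to the constant-time regime mentioned in the abstract; re-predicting one position per step instead recovers linear-time decoding.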

Related research

10/13/2020  Incorporating BERT into Parallel Sequence Decoding with Adapters
While large scale pre-trained language models such as BERT have achieved...

10/25/2019  Fast Structured Decoding for Sequence Models
Autoregressive sequence models achieve state-of-the-art performance in d...

10/08/2020  Don't Parse, Insert: Multilingual Semantic Parsing with Insertion Based Decoding
Semantic parsing is one of the key components of natural language unders...

10/03/2022  A Non-monotonic Self-terminating Language Model
Recent large-scale neural autoregressive sequence models have shown impr...

12/16/2021  Learning and Analyzing Generation Order for Undirected Sequence Models
Undirected neural sequence models have achieved performance competitive ...

10/30/2020  VECO: Variable Encoder-decoder Pre-training for Cross-lingual Understanding and Generation
Recent studies about learning multilingual representations have achieved...

10/11/2022  Viterbi Decoding of Directed Acyclic Transformer for Non-Autoregressive Machine Translation
Non-autoregressive models achieve significant decoding speedup in neural...
