A Generalized Framework of Sequence Generation with Application to Undirected Sequence Models

by Elman Mansimov, et al.
New York University

Undirected neural sequence models such as BERT have received renewed interest due to their success on discriminative natural language understanding tasks such as question-answering and natural language inference. The problem of generating sequences directly from these models has received relatively little attention, in part because generating from such models departs significantly from the conventional approach of monotonic generation in directed sequence models. We investigate this problem by first proposing a generalized model of sequence generation that unifies decoding in directed and undirected models. The proposed framework models the process of generation rather than a resulting sequence, and under this framework, we derive various neural sequence models as special cases, such as autoregressive, semi-autoregressive, and refinement-based non-autoregressive models. This unification enables us to adapt decoding algorithms originally developed for directed sequence models to undirected models. We demonstrate this by evaluating various decoding strategies for the recently proposed cross-lingual masked translation model. Our experiments reveal that generation from undirected sequence models, under our framework, is competitive against the state of the art on WMT'14 English-German translation. We furthermore observe that the proposed approach enables constant-time translation while losing only 1 BLEU score compared to linear-time translation from the same undirected neural sequence model.
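The constant-time decoding the abstract refers to is typically realized by iterative refinement over a fully masked sequence: all positions are predicted in parallel, the least confident predictions are re-masked, and the process repeats for a fixed number of iterations independent of sequence length. The following is a minimal toy sketch of that mask-predict-style loop, not the paper's exact algorithm; `predict_fn` is a hypothetical stand-in for a trained masked language model that returns a (token, score) pair per position.

```python
def mask_predict(predict_fn, length, iterations=4, mask="<mask>"):
    """Toy sketch of iterative-refinement decoding from an undirected
    (masked) sequence model. `predict_fn(tokens)` is a hypothetical
    model interface returning one (token, score) pair per position."""
    tokens = [mask] * length          # start fully masked (constant-time regime)
    scores = [0.0] * length
    for it in range(iterations):
        preds = predict_fn(tokens)    # predict all positions in parallel
        for i, (tok, score) in enumerate(preds):
            if tokens[i] == mask:     # only fill currently masked slots
                tokens[i], scores[i] = tok, score
        # linearly decay how many low-confidence tokens get re-masked;
        # the final iteration re-masks nothing, so the output is complete
        n_mask = length * (iterations - it - 1) // iterations
        for i in sorted(range(length), key=lambda j: scores[j])[:n_mask]:
            tokens[i] = mask
    return tokens
```

Because the number of iterations is fixed, the number of model calls does not grow with sequence length, unlike left-to-right autoregressive decoding, which needs one call per output token.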



Related papers:

- Incorporating BERT into Parallel Sequence Decoding with Adapters
- Fast Structured Decoding for Sequence Models
- Don't Parse, Insert: Multilingual Semantic Parsing with Insertion Based Decoding
- A Non-monotonic Self-terminating Language Model
- Learning and Analyzing Generation Order for Undirected Sequence Models
- VECO: Variable Encoder-decoder Pre-training for Cross-lingual Understanding and Generation
- Viterbi Decoding of Directed Acyclic Transformer for Non-Autoregressive Machine Translation
