The Importance of Generation Order in Language Modeling

08/23/2018
by Nicolas Ford, et al.

Neural language models are a critical component of state-of-the-art systems for machine translation, summarization, audio transcription, and other tasks. These language models are almost universally autoregressive in nature, generating sentences one token at a time from left to right. This paper studies the influence of token generation order on model quality via a novel two-pass language model that produces partially-filled sentence "templates" and then fills in missing tokens. We compare various strategies for structuring these two passes and observe a surprisingly large variation in model quality. We find the most effective strategy generates function words in the first pass followed by content words in the second. We believe these experimental results justify a more extensive investigation of generation order for neural language models.
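To make the two-pass scheme concrete, below is a minimal, hypothetical sketch of template decoding: the first pass emits function words and a placeholder token wherever a content word belongs, and the second pass fills those placeholders left to right. This is not the paper's implementation; the toy uniform-sampling model and the hand-picked function/content word lists here stand in for a trained neural language model and a real part-of-speech split.

```python
"""Illustrative sketch of two-pass "template" generation.

Assumptions (not from the paper): ToyLM, BLANK, and the word lists
are invented stand-ins for a trained model and a POS-based vocabulary split.
"""
import random

BLANK = "<blank>"  # placeholder emitted for content words in pass one
FUNCTION_WORDS = ["the", "of", "to", "a", "and", "."]   # hypothetical split
CONTENT_WORDS = ["cat", "mat", "dog", "park", "sat", "ran"]

class ToyLM:
    """Stand-in for a trained language model: ignores the prefix and
    samples uniformly from the allowed vocabulary."""
    def sample(self, prefix, vocab):
        return random.choice(vocab)

def first_pass(model, max_len=10):
    """Pass 1: generate a sentence template left to right, choosing
    either a function word or BLANK at each position."""
    template = []
    while len(template) < max_len:
        tok = model.sample(template, FUNCTION_WORDS + [BLANK])
        template.append(tok)
        if tok == ".":  # treat the period as end of sentence
            break
    return template

def second_pass(model, template):
    """Pass 2: fill each BLANK left to right, conditioning on the
    template and everything filled in so far."""
    out = list(template)
    for i, tok in enumerate(out):
        if tok == BLANK:
            out[i] = model.sample(out[:i], CONTENT_WORDS)
    return out

if __name__ == "__main__":
    lm = ToyLM()
    tmpl = first_pass(lm)
    print("template:", " ".join(tmpl))
    print("filled:  ", " ".join(second_pass(lm, tmpl)))
```

Running the sketch prints a template such as "the <blank> of a <blank> ." followed by its completed sentence. In the paper both passes are learned by the model, and this function-words-first ordering was the most effective of the strategies compared.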

