
Comparison of Diverse Decoding Methods from Conditional Language Models

06/14/2019
by Daphne Ippolito, et al.
University of Pennsylvania

While conditional language models have greatly improved in their ability to output high-quality natural language, many NLP applications benefit from being able to generate a diverse set of candidate sequences. Diverse decoding strategies aim to, within a given-sized candidate list, cover as much of the space of high-quality outputs as possible, leading to improvements for tasks that re-rank and combine candidate outputs. Standard decoding methods, such as beam search, optimize for generating high likelihood sequences rather than diverse ones, though recent work has focused on increasing diversity in these methods. In this work, we perform an extensive survey of decoding-time strategies for generating diverse outputs from conditional language models. We also show how diversity can be improved without sacrificing quality by over-sampling additional candidates, then filtering to the desired number.
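The over-sampling idea in the last sentence can be sketched in a few lines. The snippet below is a minimal illustration of the general "sample more than you need, then filter down" strategy, not the paper's exact procedure: `sample_candidate`, `log_likelihood`, and the duplicate-based filter are hypothetical placeholders standing in for a real conditional language model's decoder, its sequence score, and whatever quality or diversity criterion one prefers.

```python
import random

# Toy stand-ins for a real conditional language model. In practice,
# sample_candidate would be beam search or nucleus sampling from the
# model, and log_likelihood its sequence log-probability; both are
# hypothetical placeholders, not APIs from the paper.
VOCAB = ["the", "cat", "sat", "on", "a", "mat", "dog", "ran", "off"]

def sample_candidate(context, length=5):
    return " ".join(random.choice(VOCAB) for _ in range(length))

def log_likelihood(context, candidate):
    return -sum(random.uniform(0.5, 2.0) for _ in candidate.split())

def oversample_then_filter(context, k, oversample_factor=4):
    """Draw k * oversample_factor candidates, drop duplicates so the
    list stays diverse, then keep the k highest-likelihood survivors."""
    pool = [sample_candidate(context) for _ in range(k * oversample_factor)]
    seen, unique = set(), []
    for cand in pool:
        key = tuple(cand.lower().split())  # crude near-duplicate key
        if key not in seen:
            seen.add(key)
            unique.append(cand)
    unique.sort(key=lambda c: log_likelihood(context, c), reverse=True)
    return unique[:k]

if __name__ == "__main__":
    for cand in oversample_then_filter("The cat", k=3):
        print(cand)
```

In a real system the duplicate check would typically be replaced by an n-gram overlap or embedding-similarity threshold, and the final ranking by the model's actual (length-normalized) log-likelihood or a learned re-ranker.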

