Comparison of Diverse Decoding Methods from Conditional Language Models

06/14/2019
by Daphne Ippolito, et al.

While conditional language models have greatly improved in their ability to output high-quality natural language, many NLP applications benefit from being able to generate a diverse set of candidate sequences. Diverse decoding strategies aim to, within a given-sized candidate list, cover as much of the space of high-quality outputs as possible, leading to improvements for tasks that re-rank and combine candidate outputs. Standard decoding methods, such as beam search, optimize for generating high likelihood sequences rather than diverse ones, though recent work has focused on increasing diversity in these methods. In this work, we perform an extensive survey of decoding-time strategies for generating diverse outputs from conditional language models. We also show how diversity can be improved without sacrificing quality by over-sampling additional candidates, then filtering to the desired number.

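The abstract's over-sampling idea can be made concrete: draw more candidates than the requested list size, then filter the pool down to the desired number. The sketch below is a minimal plain-Python illustration under assumed details; the sample_fn callable, the n-gram-overlap diversity filter, and the oversample_factor are placeholders chosen for illustration, not the exact procedure used in the paper.

    import random

    def ngram_overlap(a, b, n=2):
        """Fraction of shared word n-grams between two candidate strings."""
        grams = lambda s: {tuple(s.split()[i:i + n])
                           for i in range(len(s.split()) - n + 1)}
        ga, gb = grams(a), grams(b)
        if not ga or not gb:
            return 0.0
        return len(ga & gb) / min(len(ga), len(gb))

    def oversample_then_filter(sample_fn, prompt, k, oversample_factor=10):
        """Draw k * oversample_factor candidates, then greedily keep the k
        candidates that are least similar to those already kept."""
        pool = [sample_fn(prompt) for _ in range(k * oversample_factor)]
        kept = [pool.pop(0)]  # seed with the first candidate drawn
        while len(kept) < k and pool:
            # keep the pool candidate whose worst-case overlap with the
            # already-kept candidates is smallest (max-min diversity)
            best = min(pool, key=lambda c: max(ngram_overlap(c, s) for s in kept))
            kept.append(best)
            pool.remove(best)
        return kept

    # Toy usage with a canned sampler standing in for a real conditional LM:
    canned = ["the cat sat on the mat", "the cat sat on a rug",
              "a dog slept by the door", "birds sang in the garden"]
    sample = lambda prompt: random.choice(canned)
    print(oversample_then_filter(sample, "Describe the scene:", k=2))

Greedily selecting the candidate least similar to those already kept is only one possible filtering rule; re-ranking the oversampled pool by a quality score is equally compatible with this oversample-then-filter pattern.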

Related research

09/30/2022
Calibrating Sequence Likelihood Improves Conditional Language Generation
Conditional language models are predominantly trained with maximum likel...

10/18/2022
Arithmetic Sampling: Parallel Diverse Decoding for Large Language Models
Decoding methods for large language models often trade-off between diver...

03/23/2023
A Simple Explanation for the Phase Transition in Large Language Models with List Decoding
Various recent experimental results show that large language models (LLM...

11/14/2022
Follow the Wisdom of the Crowd: Effective Text Generation via Minimum Bayes Risk Decoding
In open-ended natural-language generation, existing text decoding method...

09/21/2023
Reranking for Natural Language Generation from Logical Forms: A Study based on Large Language Models
Large language models (LLMs) have demonstrated impressive capabilities i...

11/12/2018
Combining Learned Lyrical Structures and Vocabulary for Improved Lyric Generation
The use of language models for generating lyrics and poetry has received...

10/22/2021
Lightweight Decoding Strategies for Increasing Specificity
Language models are known to produce vague and generic outputs. We propo...
