GPoeT-2: A GPT-2 Based Poem Generator

05/18/2022
by Kai-Ling Lo, et al.

This project aims to produce the next volume of machine-generated poetry, a complex art form that can be structured or unstructured and carries depth in the meaning between the lines. GPoeT-2 fine-tunes a state-of-the-art natural language model (GPT-2) to generate limericks: typically humorous, structured poems consisting of five lines with an AABBA rhyming scheme. Using a two-stage generation system that combines forward and reverse language modeling, GPoeT-2 can freely generate limericks on diverse topics while following the rhyming structure, without any seed phrase or a posteriori constraints. Building on this automated generation process, we explore a wide variety of evaluation metrics to quantify "good poetry," including syntactic correctness, lexical diversity, and subject continuity. Finally, we present a collection of 94 categorized limericks that rank highly on the explored "good poetry" metrics, in the hope of provoking human creativity.
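The AABBA constraint the abstract describes can be checked automatically. Below is a minimal sketch, not the paper's code: it substitutes a crude letter-suffix heuristic for a real pronunciation dictionary (such as CMUdict), and the helper names `crude_rhymes` and `check_aabba` are hypothetical.

```python
def crude_rhymes(a: str, b: str, n: int = 3) -> bool:
    """Crude rhyme heuristic: compare the last n letters of each word.
    A real system would compare phonemes from a pronunciation dictionary."""
    a = a.lower().strip(".,!?;:'\"")
    b = b.lower().strip(".,!?;:'\"")
    return a[-n:] == b[-n:]

def check_aabba(limerick: str) -> bool:
    """Return True if the five lines of `limerick` follow an AABBA scheme."""
    last_words = [line.split()[-1] for line in limerick.strip().splitlines()]
    if len(last_words) != 5:
        return False
    a1, a2, b1, b2, a3 = last_words
    return (crude_rhymes(a1, a2) and crude_rhymes(a1, a3)
            and crude_rhymes(b1, b2) and not crude_rhymes(a1, b1))

# Illustrative five-line example with Spain/rain/plain (A) and bright/night (B):
lim = """A poet who lived down in Spain
Wrote verses outside in the rain
His meter was bright
He scribbled all night
Till the ink washed away on the plain"""
```

A filter like this could serve as a post-hoc check; the paper's approach instead builds the rhyme in during generation via reverse language modeling, so no such a posteriori constraint is needed.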


