A Recipe For Arbitrary Text Style Transfer with Large Language Models

09/08/2021
by Emily Reif, et al.

In this paper, we leverage large language models (LMs) to perform zero-shot text style transfer. We present a prompting method that we call augmented zero-shot learning, which frames style transfer as a sentence rewriting task and requires only a natural language instruction, without model fine-tuning or exemplars in the target style. Augmented zero-shot learning is simple and demonstrates promising results not just on standard style transfer tasks such as sentiment, but also on arbitrary transformations such as "make this melodramatic" or "insert a metaphor."
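The augmented zero-shot recipe can be sketched as a prompt-construction step: a handful of generic sentence-rewriting exemplars (in styles unrelated to the target) are prepended to a single natural-language rewrite instruction. The sketch below is illustrative; the exemplar pairs and the curly-brace delimiter convention are assumptions loosely following the sentence-rewriting framing described in the abstract, not the paper's exact prompt.

```python
def build_augmented_zero_shot_prompt(sentence, instruction, exemplars=None):
    """Assemble an augmented zero-shot prompt for a text-completion LM.

    The exemplars demonstrate the rewriting *format* only; none of them
    needs to match the target style, which is what makes the method
    zero-shot with respect to the requested transformation.
    """
    if exemplars is None:
        # Hypothetical placeholder exemplars, not taken from the paper.
        exemplars = [
            "Here is some text: {The show was awful.} "
            "Here is a rewrite of the text, which is more positive: "
            "{The show had room to improve.}",
            "Here is some text: {We will meet at noon.} "
            "Here is a rewrite of the text, which is more formal: "
            "{Our meeting is scheduled for twelve o'clock.}",
        ]
    # The final block states the arbitrary instruction and leaves the
    # rewrite open for the model to complete.
    request = (
        f"Here is some text: {{{sentence}}} "
        f"Here is a rewrite of the text, which is {instruction}: {{"
    )
    return "\n\n".join(exemplars + [request])

prompt = build_augmented_zero_shot_prompt(
    "The sky is blue.", "more melodramatic"
)
```

Because the method changes only the prompt, swapping "more melodramatic" for any other instruction (e.g. "including a metaphor") requires no fine-tuning or target-style exemplars.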


Related research

05/23/2022
Prompt-and-Rerank: A Method for Zero-Shot and Few-Shot Arbitrary Textual Style Transfer with Small Language Models
We propose a method for arbitrary textual style transfer (TST), the task ...

05/24/2023
Instruction Tuning with Lexicons for Zero-Shot Style Classification
Style is used to convey authors' intentions and attitudes. Despite the s...

11/13/2017
Zero-Shot Style Transfer in Text Using Recurrent Neural Networks
Zero-shot translation is the task of translating between a language pair...

03/23/2021
Detecting Hate Speech with GPT-3
Sophisticated language models such as OpenAI's GPT-3 can generate hatefu...

02/16/2023
Conversation Style Transfer using Few-Shot Learning
Conventional text style transfer approaches for natural language focus o...

04/24/2023
Master: Meta Style Transformer for Controllable Zero-Shot and Few-Shot Artistic Style Transfer
Transformer-based models achieve favorable performance in artistic style...

10/27/2022
He Said, She Said: Style Transfer for Shifting the Perspective of Dialogues
In this work, we define a new style transfer task: perspective shift, wh...
