Classifiers are Better Experts for Controllable Text Generation

05/15/2022
by   Askhat Sitdikov, et al.

This paper proposes a simple method for controllable text generation based on reweighting the logits produced by a language model, namely CAIF sampling. Using an arbitrary third-party text classifier, we adjust a small part of the language model's logits and guide text generation toward or away from the classifier's prediction. We show that the proposed method significantly outperforms the recent PPLM, GeDi, and DExperts approaches on perplexity and on sentiment accuracy of generated texts as measured by an external classifier. At the same time, it is also easier to implement and tune, and has significantly fewer restrictions and requirements.
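The core idea described above can be sketched in a few lines. The following is a minimal, self-contained illustration (not the authors' implementation): only the top-k logits of a single decoding step are shifted by a term proportional to the log-probability an attribute classifier assigns to each candidate token, so positive guidance strength steers generation toward the classifier's target attribute and negative strength steers away from it. The function names, the toy classifier, and the guidance parameter `alpha` are all assumptions for illustration.

```python
import math

def caif_reweight(logits, candidates, classifier, alpha=5.0, top_k=3):
    """Sketch of classifier-guided logit reweighting (toy stand-in for CAIF).

    Only the top-k logits are adjusted: each candidate token's logit is
    shifted by alpha * log p(attribute | candidate). alpha > 0 steers
    generation toward the classifier's attribute, alpha < 0 away from it.
    """
    # Rank token indices by logit and adjust only the top-k of them.
    order = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)
    adjusted = list(logits)  # tokens outside the top-k keep their logits
    for i in order[:top_k]:
        p = classifier(candidates[i])  # p(attribute | token), in (0, 1]
        adjusted[i] = logits[i] + alpha * math.log(p)
    return adjusted

def softmax(xs):
    """Convert adjusted logits back into a sampling distribution."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Toy usage: a sentiment "classifier" that favors "great" over "awful".
vocab = ["great", "awful", "okay", "the"]
logits = [2.0, 2.1, 1.0, 0.5]
clf = lambda tok: {"great": 0.9, "awful": 0.1}.get(tok, 0.5)
adjusted = caif_reweight(logits, vocab, clf, alpha=5.0, top_k=3)
```

In this toy example the unguided model slightly prefers "awful", but after reweighting the highest-scoring token becomes "great", mirroring the paper's claim that adjusting only a small part of the logits is enough to steer generation.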


Related research

06/06/2023 — Click: Controllable Text Generation with Sequence Likelihood Contrastive Learning
It has always been an important yet challenging problem to control langu...

05/09/2023 — DeepTextMark: Deep Learning based Text Watermarking for Detection of Large Language Model Generated Text
The capabilities of text generators have grown with the rapid developmen...

03/30/2021 — AfriKI: Machine-in-the-Loop Afrikaans Poetry Generation
This paper proposes a generative language model called AfriKI. Our appro...

03/25/2020 — Heavy-tailed Representations, Text Polarity Classification Data Augmentation
The dominant approaches to text representation in natural language rely ...

06/18/2022 — Collocation2Text: Controllable Text Generation from Guide Phrases in Russian
Large pre-trained language models are capable of generating varied and f...

09/25/2020 — Weird AI Yankovic: Generating Parody Lyrics
Lyrics parody swaps one set of words that accompany a melody with a new ...

06/17/2023 — KEST: Kernel Distance Based Efficient Self-Training for Improving Controllable Text Generation
Self-training (ST) has come to fruition in language understanding tasks ...
