Collocation2Text: Controllable Text Generation from Guide Phrases in Russian

06/18/2022
by Sergey Vychegzhanin, et al.

Large pre-trained language models can generate varied and fluent texts. Starting from a prompt, these models produce a narrative that can develop unpredictably. Existing methods of controllable text generation, which steer the narrative in a user-specified direction, require building a training corpus and an additional time-consuming training procedure. This paper proposes and investigates Collocation2Text, a plug-and-play method for controllable text generation in Russian that does not require fine-tuning. The method is based on two interacting models: the autoregressive language model ruGPT-3 and the autoencoding language model ruRoBERTa. The idea of the method is to shift the output distribution of the autoregressive model according to the output distribution of the autoencoding model so as to ensure a coherent transition of the narrative towards the guide phrase, which can contain single words or collocations. The autoencoding model, which can take into account both the left and right contexts of a token, "tells" the autoregressive model which tokens are the most and least logical at the current generation step, increasing or decreasing the probabilities of the corresponding tokens. Experiments on generating news articles with the proposed method showed its effectiveness for automatically producing fluent texts that contain coherent transitions between user-specified phrases.
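The abstract does not spell out the exact fusion rule, but the distribution-shifting idea can be illustrated with a minimal sketch. The sketch below assumes a simple log-linear interpolation between the next-token distribution of the autoregressive model and a masked-token distribution from the autoencoding model (computed with the guide phrase as right context); the function name, the interpolation weight, and the shared-vocabulary assumption are illustrative, not part of the published method.

# Hypothetical sketch of the distribution-shifting idea; the actual fusion rule
# used by Collocation2Text may differ from this log-linear interpolation.
import torch

def shift_distribution(ar_logits: torch.Tensor,
                       mlm_logits: torch.Tensor,
                       weight: float = 0.3) -> torch.Tensor:
    """Combine next-token logits of the autoregressive model with logits
    from the autoencoding model for the same position.

    ar_logits  -- logits from the autoregressive model (e.g. ruGPT-3) for the
                  next position, shape (vocab_size,)
    mlm_logits -- logits from the autoencoding model (e.g. ruRoBERTa) for a
                  mask placed at the same position with the guide phrase
                  appended as right context, shape (vocab_size,); a shared
                  vocabulary is assumed here, which in practice requires
                  mapping between the two tokenizers
    weight     -- how strongly the autoencoding model "tells" the
                  autoregressive model which tokens fit the guide phrase
    """
    ar_logprobs = torch.log_softmax(ar_logits, dim=-1)
    mlm_logprobs = torch.log_softmax(mlm_logits, dim=-1)
    combined = (1.0 - weight) * ar_logprobs + weight * mlm_logprobs
    return torch.softmax(combined, dim=-1)

# Toy usage with random logits over a shared 10-token vocabulary.
next_token_probs = shift_distribution(torch.randn(10), torch.randn(10))
next_token = torch.multinomial(next_token_probs, num_samples=1)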
