Incorporating Stylistic Lexical Preferences in Generative Language Models

10/22/2020
by   Hrituraj Singh, et al.

While recent advances in language modeling have produced powerful generation models, their generation style remains implicitly dependent on the training data and cannot emulate a specific target style. Leveraging the generative capabilities of a transformer-based language model, we present an approach to induce certain target-author attributes by incorporating an author's continuous, multi-dimensional lexical preferences into generative language models. We introduce rewarding strategies in a reinforcement learning framework that encourage the use of words across multiple categorical dimensions, to varying extents. Our experiments demonstrate that the proposed approach can generate text that distinctively aligns with a given target author's lexical style. We conduct quantitative and qualitative comparisons with competitive and relevant baselines to illustrate the benefits of the proposed approach.
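To make the reward idea concrete, one plausible reading is a reward proportional to how closely the generated text's per-category word usage matches the target author's continuous preference vector. The sketch below assumes this interpretation; the category names, word lists, and L1-distance reward are illustrative placeholders, not the paper's actual lexicons or reward definition.

```python
from collections import Counter

# Hypothetical lexical categories and word lists (placeholders only);
# a real system would use author-derived lexicons spanning many dimensions.
CATEGORIES = {
    "positive": {"good", "great", "happy"},
    "negative": {"bad", "sad", "awful"},
    "formal":   {"therefore", "moreover", "thus"},
}

def category_profile(tokens):
    """Fraction of tokens that fall into each lexical category."""
    counts = Counter()
    for tok in tokens:
        for cat, words in CATEGORIES.items():
            if tok in words:
                counts[cat] += 1
    n = max(len(tokens), 1)
    return {cat: counts[cat] / n for cat in CATEGORIES}

def lexical_reward(generated_tokens, target_profile):
    """Higher (closer to 0) when the generated text's category usage
    matches the target author's continuous preferences (negative L1 distance)."""
    profile = category_profile(generated_tokens)
    distance = sum(abs(profile[c] - target_profile[c]) for c in CATEGORIES)
    return -distance
```

In an RL fine-tuning loop, such a scalar could be combined with a fluency term and used as the episode reward for sampled generations, nudging the model toward the target author's usage rates across all dimensions at once.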

