Controllable Text Generation for Open-Domain Creativity and Fairness

09/24/2022
by   Nanyun Peng, et al.

Recent advances in large pre-trained language models have demonstrated strong results in generating natural language and have significantly improved performance on many natural language generation (NLG) applications such as machine translation and text summarization. However, when the generation task is more open-ended and the content is under-specified, existing techniques struggle to generate long-term coherent and creative content. Moreover, the models exhibit, and can even amplify, social biases learned from the training corpora. This happens because generation models are trained to capture surface patterns (i.e., sequences of words) rather than underlying semantics, discourse structures, and background knowledge, including social norms. In this paper, I introduce our recent work on controllable text generation to enhance the creativity and fairness of language generation models. We explore hierarchical generation and constrained decoding, with applications to creative language generation, including story, poetry, and figurative language, as well as bias mitigation for generation models.
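To make the idea of constrained decoding concrete, the sketch below shows a generic form of lexically constrained generation: keywords that might come from a higher-level plan (as in hierarchical, plan-then-write generation) are forced to appear in the output via constrained beam search. This is only an illustrative sketch using Hugging Face Transformers' `force_words_ids` option with a stand-in `gpt2` model and made-up keywords; it is not the specific method proposed in the paper.

```python
# Illustrative sketch (not the paper's implementation): lexically constrained
# decoding via constrained beam search in Hugging Face Transformers.
# Assumes a recent transformers version that supports `force_words_ids`.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")   # stand-in model
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Once upon a time"
# Hypothetical keywords, e.g. produced by a higher-level story plan.
# The leading space matters for GPT-2's BPE tokenization.
keywords = [" dragon", " castle"]
force_words_ids = tokenizer(keywords, add_special_tokens=False).input_ids

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    force_words_ids=force_words_ids,  # each keyword must appear in the output
    num_beams=5,                      # constrained decoding requires beam search
    max_new_tokens=60,
    no_repeat_ngram_size=2,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In this setup, the planning stage (which keywords to realize, and in what order) and the surface realization stage are decoupled, which is the general intuition behind hierarchical generation; the constraints steer the decoder without retraining the underlying language model.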

