FAST: Improving Controllability for Text Generation with Feedback Aware Self-Training

10/06/2022
by   Junyi Chai, et al.

Controllable text generation systems often leverage control codes to direct various properties of the output like style and length. Inspired by recent work on causal inference for NLP, this paper reveals a previously overlooked flaw in these control code-based conditional text generation algorithms. Spurious correlations in the training data can lead models to incorrectly rely on parts of the input other than the control code for attribute selection, significantly undermining downstream generation quality and controllability. We demonstrate the severity of this issue with a series of case studies and then propose two simple techniques to reduce these correlations in training sets. The first technique is based on resampling the data according to an example's propensity towards each linguistic attribute (IPS). The second produces multiple counterfactual versions of each example and then uses an additional feedback mechanism to remove noisy examples (feedback aware self-training, FAST). We evaluate on 3 tasks – news headline, meta review, and search ads generation – and demonstrate that FAST can significantly improve the controllability and language quality of generated outputs when compared to state-of-the-art controllable text generation approaches.
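To make the two techniques concrete, the following is a minimal Python sketch of both ideas as described in the abstract. The function names (ips_resample, fast_augment) and the propensity, generate, and classify callables are illustrative assumptions, not the paper's actual interfaces; the abstract does not specify how propensities are estimated or which feedback model is used.

```python
import random
from typing import Callable, Iterable, List, Tuple

def ips_resample(
    examples: List[Tuple[str, str]],          # (input_text, attribute) pairs
    propensity: Callable[[str, str], float],  # estimated p(attribute | input_text)
    seed: int = 0,
) -> List[Tuple[str, str]]:
    """Resample the training set with inverse-propensity weights so that
    attributes become roughly independent of the rest of the input."""
    rng = random.Random(seed)
    weights = [1.0 / max(propensity(text, attr), 1e-6) for text, attr in examples]
    return rng.choices(examples, weights=weights, k=len(examples))

def fast_augment(
    examples: Iterable[Tuple[str, str]],      # (input_text, original control code)
    control_codes: List[str],
    generate: Callable[[str, str], str],      # conditional model: (input, code) -> output
    classify: Callable[[str], str],           # attribute classifier: output -> realized code
) -> List[Tuple[str, str, str]]:
    """Build a counterfactually balanced set: generate one output per control
    code for every input, then keep only outputs whose realized attribute
    matches the requested code (the feedback filter removing noisy examples)."""
    kept = []
    for text, _ in examples:
        for code in control_codes:
            output = generate(text, code)
            if classify(output) == code:  # feedback check: discard mismatched outputs
                kept.append((text, code, output))
    return kept
```

In the full FAST recipe, the filtered counterfactual set would then be used to re-train the conditional generator (the self-training step), which is omitted in this sketch.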


Related Research

01/22/2022 · A Causal Lens for Controllable Text Generation
Controllable text generation concerns two fundamental tasks of wide appl...

09/25/2020 · Controllable Text Generation with Focused Variation
This work introduces Focused-Variation Network (FVN), a novel model to c...

10/05/2020 · CAT-Gen: Improving Robustness in NLP Models via Controlled Adversarial Text Generation
NLP models are shown to suffer from robustness issues, i.e., a model's p...

03/30/2023 · Self-Refine: Iterative Refinement with Self-Feedback
Like people, LLMs do not always generate the best text for a given gener...

12/16/2022 · DuNST: Dual Noisy Self Training for Semi-Supervised Controllable Text Generation
Self-training (ST) has prospered again in language understanding by augm...

05/05/2023 · Stylized Data-to-Text Generation: A Case Study in the E-Commerce Domain
Existing data-to-text generation efforts mainly focus on generating a co...

10/24/2020 · CaM-Gen: Causally-aware Metric-guided Text Generation
Content is created for a well-defined purpose, often described by a metr...
