CaM-Gen: Causally-aware Metric-guided Text Generation

10/24/2020
by Navita Goyal, et al.

Content is created for a well-defined purpose, often described by a metric or a signal represented as structured information. The relationship between the target metrics, or the goal of a piece of content, and the content itself is non-trivial. While large-scale language models show promising text generation capabilities, guiding and informing the generated text with external metrics is challenging. These metrics and the content tend to have inherent relationships, and not all of them directly impact the content. We introduce CaM-Gen: Causally-aware Generative Networks guided by user-defined input metrics, incorporating the causal relationships between the metrics and the content features. We leverage causal inference techniques to identify the causally significant aspects of text that lead to the target metric, and then explicitly guide the generative model towards these through a feedback mechanism. We propose this mechanism for variational autoencoder-based and transformer-based generative models. The proposed models outperform baselines in terms of target metric accuracy while maintaining the fluency and language quality of the generated text. To the best of our knowledge, this is one of the early attempts at incorporating a causally inferred metric guide for controlled generation.
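The abstract describes a feedback mechanism in which only the causally significant content features drive the generator towards the target metric. A minimal sketch of that idea, assuming a training objective that adds a causally weighted metric-feedback term to the usual language-modeling loss (all names here, such as `metric_feedback_loss` and `lambda_fb`, are hypothetical and not from the paper):

```python
def metric_feedback_loss(feature_deviations, causal_weights):
    """Weight each feature's squared deviation from the target metric by its
    estimated causal effect, so features with little causal influence on the
    metric contribute little to the feedback signal."""
    return sum(w * (d ** 2) for d, w in zip(feature_deviations, causal_weights))

def total_loss(lm_loss, feature_deviations, causal_weights, lambda_fb=0.5):
    """Combine the standard language-modeling loss with the causally weighted
    metric-feedback term (lambda_fb balances fluency vs. metric accuracy)."""
    return lm_loss + lambda_fb * metric_feedback_loss(feature_deviations, causal_weights)

# A feature with a near-zero causal weight barely affects the objective,
# so the generator is steered mainly by causally significant features.
loss = total_loss(
    lm_loss=2.0,
    feature_deviations=[0.4, 1.0],  # deviation of each feature from the target metric
    causal_weights=[0.9, 0.05],     # causal effect estimates (e.g., from causal inference)
)
```

In the paper's actual models this feedback would be backpropagated through a VAE or transformer generator; the sketch only illustrates how causal weighting shapes the objective.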

