Few-Shot Table-to-Text Generation with Prompt Planning and Knowledge Memorization

02/09/2023
by   Zhixin Guo, et al.

Pre-trained language models (PLMs) have achieved remarkable advances in table-to-text generation. However, the lack of labeled domain-specific knowledge and the topology gap between tabular data and text make it difficult for PLMs to yield faithful text, and low-resource settings pose further challenges. Inspired by how humans describe tabular data with prior knowledge, we propose a new framework, PromptMize, for table-to-text generation under few-shot settings. Our framework consists of two components: a prompt planner and a knowledge adapter. The prompt planner generates a prompt signal that provides instance-level guidance for PLMs, bridging the topology gap between tabular data and text. The knowledge adapter memorizes domain-specific knowledge from an unlabelled corpus to supply essential information during generation. We conduct extensive experiments and analyses on three open-domain few-shot NLG datasets: human, song, and book. Compared with previous state-of-the-art approaches, our model achieves remarkable improvements in generation quality under both human and automatic evaluation.
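The two components described above can be illustrated with a minimal sketch. Everything here is a hypothetical reconstruction from the abstract alone: the names `plan_prompt` and `KnowledgeAdapter`, the table linearization, and the word-overlap retrieval are illustrative assumptions, not the authors' actual method.

```python
# Hypothetical sketch of the PromptMize pipeline described in the abstract.
# plan_prompt and KnowledgeAdapter are illustrative names, not the paper's API.

def plan_prompt(table: dict) -> str:
    """Prompt planner (sketch): linearize table fields into an
    instance-level prompt that a PLM can condition on."""
    return "; ".join(f"{k} is {v}" for k, v in table.items()) + "."

class KnowledgeAdapter:
    """Knowledge adapter (sketch): retrieve sentences from an
    unlabelled domain corpus that overlap with the table's values."""
    def __init__(self, corpus):
        self.corpus = corpus

    def retrieve(self, table, k=1):
        values = {str(v).lower() for v in table.values()}
        # Score each sentence by how many of its tokens match a table value.
        scored = [(sum(tok.lower().strip(".,") in values
                       for tok in sent.split()), sent)
                  for sent in self.corpus]
        scored.sort(key=lambda pair: -pair[0])
        return [sent for score, sent in scored[:k] if score > 0]

table = {"name": "John Coltrane", "occupation": "saxophonist"}
corpus = ["John Coltrane was an American jazz saxophonist.",
          "The song was released in 1959."]

prompt = plan_prompt(table)
evidence = KnowledgeAdapter(corpus).retrieve(table)
# In the real framework, a PLM would generate text conditioned on both.
model_input = prompt + " " + " ".join(evidence)
print(model_input)
```

The sketch only shows how a prompt signal and retrieved domain knowledge could be combined into one model input; the paper's actual planner and adapter are learned components, not string heuristics.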

Related research

02/24/2023 · Few-Shot Table-to-Text Generation with Prompt-based Adapter
Pre-trained language models (PLMs) have made remarkable progress in tabl...

08/23/2022 · Few-Shot Table-to-Text Generation with Prefix-Controlled Generator
Neural table-to-text generation approaches are data-hungry, limiting the...

08/27/2021 · Few-Shot Table-to-Text Generation with Prototype Memory
Neural table-to-text generation models have achieved remarkable progress...

03/01/2022 · Attend, Memorize and Generate: Towards Faithful Table-to-Text Generation in Few Shots
Few-shot table-to-text generation is a task of composing fluent and fait...

02/17/2021 · Towards Faithfulness in Open Domain Table-to-text Generation from an Entity-centric View
In open domain table-to-text generation, we notice that the unfaithful g...

04/14/2022 · Rows from Many Sources: Enriching row completions from Wikidata with a pre-trained Language Model
Row completion is the task of augmenting a given table of text and numbe...

05/25/2023 · T2TD: Text-3D Generation Model based on Prior Knowledge Guidance
In recent years, 3D models have been utilized in many applications, such...
