PLOG: Table-to-Logic Pretraining for Logical Table-to-Text Generation

05/25/2022
by   Ao Liu, et al.

Logical table-to-text generation is the task of generating logically faithful sentences from tables, which requires models to derive logical-level facts from table records via logical inference. This raises a new challenge for the logical-level content planning of table-to-text models. However, learning logical inference knowledge directly from table-text pairs is very difficult for neural models because of the ambiguity of natural language and the scarcity of parallel data; hence, even large-scale pretrained language models exhibit low logical fidelity on logical table-to-text generation. In this work, we propose PLOG (Pretrained Logical Form Generator), a framework that improves generation fidelity. Specifically, PLOG is first pretrained on a table-to-logic-form generation (table-to-logic) task, then finetuned on downstream table-to-text tasks. The formal definition of logical forms enables us to collect a large number of accurate logical forms from tables without human annotation. In addition, PLOG can learn logical inference much more reliably from table-logic pairs than from table-text pairs. To evaluate our model, we further collect a controlled logical table-to-text dataset, CONTLOG, based on an existing dataset. On two benchmarks, LOGICNLG and CONTLOG, PLOG outperforms strong baselines by a large margin on logical fidelity, demonstrating the effectiveness of table-to-logic pretraining.
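To make the notion of a logical form over table records concrete, here is a minimal, hedged sketch: executable operators in the style of common table-logic grammars (e.g. filter-then-count-then-compare). The operator names `filter_eq`, `count`, and `eq`, the toy table, and the example sentence are illustrative assumptions, not the exact grammar or data used by PLOG.

```python
# Minimal sketch of an executable logical form over a toy table.
# The operators below are illustrative; PLOG's actual grammar may differ.

rows = [
    {"team": "lions", "wins": 10},
    {"team": "bears", "wins": 7},
    {"team": "hawks", "wins": 10},
]

def filter_eq(rows, column, value):
    """Keep the rows whose `column` equals `value`."""
    return [r for r in rows if r[column] == value]

def count(rows):
    """Number of rows in a (filtered) table."""
    return len(rows)

def eq(a, b):
    """Top-level comparison; a logical form evaluates to True/False."""
    return a == b

# Logical form:  eq { count { filter_eq { all_rows ; wins ; 10 } } ; 2 }
# A faithful sentence it licenses: "Two teams recorded 10 wins."
result = eq(count(filter_eq(rows, "wins", 10)), 2)
print(result)  # True
```

Because such forms are generated and executed mechanically against the table, large quantities of accurate table-logic pairs can be collected without human annotation, which is what makes the pretraining signal less ambiguous than table-text pairs.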


Related research

- Improving Logical-Level Natural Language Generation with Topic-Conditioned Data Augmentation and Logical Form Generation (12/12/2021)
- LoFT: Enhancing Faithfulness and Diversity for Table-to-Text Generation via Logic Form Control (02/06/2023)
- Towards Table-to-Text Generation with Pretrained Language Model: A Table Structure Understanding and Text Deliberating Approach (01/05/2023)
- Table-to-Text Natural Language Generation with Unseen Schemas (11/09/2019)
- Investigating the Robustness of Natural Language Generation from Logical Forms via Counterfactual Samples (10/16/2022)
- Logic2Text: High-Fidelity Natural Language Generation from Logical Forms (04/30/2020)
- Logical Natural Language Generation from Open-Domain Tables (04/22/2020)
