
Table-To-Text generation and pre-training with TabT5

10/17/2022
by Ewa Andrejczuk, et al.

Encoder-only transformer models have been successfully applied to different table understanding tasks, as in TAPAS (Herzig et al., 2020). A major limitation of these architectures is that they are constrained to classification-like tasks such as cell selection or entailment detection. We present TABT5, an encoder-decoder model that generates natural language text based on tables and textual inputs. TABT5 overcomes the encoder-only limitation by incorporating a decoder component and leverages the input structure with table-specific embeddings and pre-training. TABT5 achieves new state-of-the-art results on several domains, including spreadsheet formula prediction with a 15% increase in sequence accuracy, QA with a 2.5% increase in sequence accuracy, and data-to-text generation with a 2.5% increase in BLEU.
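To make the structure-aware encoding idea concrete, below is a minimal sketch (not the authors' implementation) of how a T5-style encoder-decoder can consume a table: cells are flattened into one token sequence, and learned row/column index embeddings are added to the token embeddings before the encoder, so the model keeps some of the 2-D layout. The checkpoint name, embedding sizes, linearization order, and the linearize helper are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Hypothetical row/column index embeddings (sizes are illustrative assumptions).
MAX_ROWS, MAX_COLS = 64, 32
d_model = model.config.d_model
row_emb = nn.Embedding(MAX_ROWS, d_model)
col_emb = nn.Embedding(MAX_COLS, d_model)

def linearize(table):
    """Flatten a table (list of rows) into token ids, keeping (row, col) per token."""
    ids, rows, cols = [], [], []
    for r, row in enumerate(table):
        for c, cell in enumerate(row):
            cell_ids = tokenizer(str(cell), add_special_tokens=False).input_ids
            ids.extend(cell_ids)
            rows.extend([r] * len(cell_ids))
            cols.extend([c] * len(cell_ids))
    return ids, rows, cols

table = [["Country", "Capital"], ["France", "Paris"], ["Poland", "Warsaw"]]
ids, rows, cols = linearize(table)

input_ids = torch.tensor([ids])
# Token embeddings plus structural embeddings, passed to the model via inputs_embeds.
token_embeds = model.get_input_embeddings()(input_ids)
struct_embeds = row_emb(torch.tensor([rows])) + col_emb(torch.tensor([cols]))
inputs_embeds = token_embeds + struct_embeds

# Generate free-form text conditioned on the table encoding.
output_ids = model.generate(inputs_embeds=inputs_embeds, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Because the structural embeddings here are randomly initialized, the output is only meaningful after fine-tuning on a table-to-text task; in TABT5 such embeddings are learned during table-specific pre-training, so the snippet only illustrates the data flow.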
