TaBERT: Pretraining for Joint Understanding of Textual and Tabular Data

05/17/2020
by Pengcheng Yin, et al.

Recent years have witnessed the burgeoning of pretrained language models (LMs) for text-based natural language (NL) understanding tasks. Such models are typically trained on free-form NL text and hence may not be suitable for tasks like semantic parsing over structured data, which require reasoning over both free-form NL questions and structured tabular data (e.g., database tables). In this paper we present TaBERT, a pretrained LM that jointly learns representations for NL sentences and (semi-)structured tables. TaBERT is trained on a large corpus of 26 million tables and their English contexts. In experiments, neural semantic parsers using TaBERT as feature representation layers achieve new best results on the challenging weakly supervised semantic parsing benchmark WikiTableQuestions, while performing competitively on the text-to-SQL dataset Spider. An implementation of the model will be available at http://fburl.com/TaBERT.
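For concreteness, the sketch below illustrates the general idea of jointly encoding an NL question and a table: linearize the table's column headers and cell values into text and feed them to a pretrained encoder together with the question. It uses a vanilla BERT model from the HuggingFace transformers library as a stand-in for TaBERT's pretrained weights; the linearization format, the toy table, and the question are illustrative assumptions, not the authors' exact scheme (the official implementation is linked above).

# Simplified illustration of joint question/table encoding in the spirit of
# TaBERT. Vanilla BERT is used as a stand-in for pretrained TaBERT weights;
# the linearization format below is an assumption for illustration only.
from transformers import BertTokenizer, BertModel
import torch

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

question = "Which city hosted the 2008 Summer Olympics?"

# Toy table: column headers and one content row.
columns = ["Year", "City", "Country"]
row = ["2008", "Beijing", "China"]

# Linearize the row as "header : value" cells joined into a single string.
table_text = " ; ".join(f"{col} : {val}" for col, val in zip(columns, row))

# Encode the question and the linearized table as a sentence pair.
inputs = tokenizer(question, table_text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Token-level contextual representations; a downstream semantic parser would
# pool these into question-token and column vectors.
hidden_states = outputs.last_hidden_state  # shape: (1, seq_len, 768)
print(hidden_states.shape)

A downstream parser built on such an encoder would typically pool the hidden states for each column's tokens into a single column vector, which is roughly the role TaBERT plays as a feature representation layer in the experiments described above.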


Related research

04/18/2021  Constrained Language Models Yield Few-Shot Semantic Parsers
We explore the use of large pretrained language models as few-shot seman...

09/29/2020  GraPPa: Grammar-Augmented Pre-Training for Table Semantic Parsing
We present GraPPa, an effective pre-training approach for table semantic...

04/16/2022  Logical Inference for Counting on Semi-structured Tables
Recently, the Natural Language Inference (NLI) task has been studied for...

05/13/2020  INFOTABS: Inference on Tables as Semi-structured Data
In this paper, we observe that semi-structured tabulated text is ubiquit...

10/05/2018  Scalable Micro-planned Generation of Discourse from Structured Data
We present a framework for generating natural language description from ...

05/23/2022  QASem Parsing: Text-to-text Modeling of QA-based Semantics
Several recent works have suggested to represent semantic relations with...

12/16/2021  Masked Measurement Prediction: Learning to Jointly Predict Quantities and Units from Textual Context
Physical measurements constitute a large portion of numbers in academic ...
