
Compositional pre-training for neural semantic parsing

05/27/2019
by Amir Ziai, et al. (Stanford University)

Semantic parsing is the process of translating natural language utterances into logical forms, with important applications such as question answering and instruction following. Sequence-to-sequence models have been very successful across many NLP tasks; however, a lack of task-specific prior knowledge can be detrimental to their performance. Prior work has used frameworks for inducing grammars over the training examples, which capture conditional independence properties that the model can leverage. Inspired by recent success stories such as BERT, we extend this augmentation framework into two stages. The first stage pre-trains on a corpus of augmented examples in an unsupervised manner; the second fine-tunes on a domain-specific task. In addition, since the pre-training stage is separate from training on the main task, we can expand the universe of possible augmentations without causing catastrophic interference. We also propose a novel data augmentation strategy that interchanges tokens that co-occur in similar contexts to produce new training pairs. We demonstrate that the proposed two-stage framework improves parsing accuracy on GeoQuery, a standard dataset for the task of generating logical forms from questions about US geography.
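The token-interchange augmentation can be illustrated with a minimal sketch. This is not the paper's exact procedure (the abstract gives no implementation details); it simply assumes "co-occur in similar contexts" means two tokens share most of their neighboring words, measured here with Jaccard similarity over context sets, and that an interchangeable pair is swapped in both the utterance and its logical form to yield a new training pair:

```python
from collections import defaultdict

def context_signatures(utterances, window=1):
    """Map each token to the set of tokens seen within `window`
    positions of it anywhere in the corpus."""
    sig = defaultdict(set)
    for utt in utterances:
        toks = utt.split()
        for i, tok in enumerate(toks):
            for j in range(max(0, i - window), min(len(toks), i + window + 1)):
                if j != i:
                    sig[tok].add(toks[j])
    return sig

def interchangeable_pairs(sig, threshold=0.5):
    """Token pairs whose context sets have Jaccard similarity >= threshold."""
    toks = sorted(sig)
    pairs = []
    for i, a in enumerate(toks):
        for b in toks[i + 1:]:
            union = sig[a] | sig[b]
            if union and len(sig[a] & sig[b]) / len(union) >= threshold:
                pairs.append((a, b))
    return pairs

def augment(train_pairs, threshold=0.5):
    """Create new (utterance, logical form) pairs by swapping tokens that
    occur in similar contexts, applying the same swap to both sides.
    Note: str.replace is a simplification and can match sub-tokens."""
    sig = context_signatures([u for u, _ in train_pairs])
    seen = set(train_pairs)
    out = []
    for a, b in interchangeable_pairs(sig, threshold):
        for src, dst in ((a, b), (b, a)):
            for utt, lf in train_pairs:
                if src in utt.split():
                    cand = (utt.replace(src, dst), lf.replace(src, dst))
                    if cand not in seen:
                        seen.add(cand)
                        out.append(cand)
    return out
```

For example, given "what is the capital of texas" and "what is the capital of ohio", the tokens `texas` and `ohio` share the context `{of}`, so applying the swap to a third pair like "name the capital of texas" yields the unseen pair "name the capital of ohio" with its matching logical form.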

