On The Ingredients of an Effective Zero-shot Semantic Parser

10/15/2021
by Pengcheng Yin, et al.

Semantic parsers map natural language utterances into meaning representations (e.g., programs). Such models are typically bottlenecked by the paucity of training data, owing to the laborious annotation effort required. Recent studies have performed zero-shot learning by synthesizing training examples of canonical utterances and programs from a grammar, and by further paraphrasing these utterances to improve linguistic diversity. However, such synthetic examples cannot fully capture the patterns found in real data. In this paper we analyze zero-shot parsers through the lenses of the language and logical gaps (Herzig and Berant, 2019), which quantify the discrepancy between canonical examples and real-world user-issued ones in terms of linguistic and programmatic patterns. We propose bridging these gaps using improved grammars, stronger paraphrasers, and efficient learning methods that focus on canonical examples most likely to reflect real user intents. Our model achieves strong performance on two semantic parsing benchmarks (Scholar, Geo) with zero labeled data.
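The synthesis-then-paraphrase pipeline described in the abstract can be illustrated with a minimal sketch. The grammar templates, the lexicon entries, and the `paraphrase` stub below are hypothetical placeholders introduced only for illustration, not the paper's grammars or paraphrase model; a real system would use a much richer synchronous grammar and a learned (e.g., neural) paraphraser.

```python
# Illustrative sketch (not the paper's code): synthesize canonical
# (utterance, program) pairs from a tiny synchronous grammar, then
# paraphrase the utterances to add linguistic diversity.
import itertools

# Each template pairs a canonical-utterance pattern with a program pattern.
# Slots such as {author} are filled from a small, hand-written lexicon.
TEMPLATES = [
    ("papers by {author}",
     "filter(papers, author == '{author}')"),
    ("papers about {topic} published after {year}",
     "filter(papers, topic == '{topic}' and year > {year})"),
]

LEXICON = {
    "author": ["Pengcheng Yin", "Jonathan Berant"],
    "topic": ["semantic parsing", "question answering"],
    "year": ["2015", "2019"],
}

def synthesize_canonical_examples():
    """Enumerate (canonical utterance, program) pairs from the grammar."""
    examples = []
    for utt_tpl, prog_tpl in TEMPLATES:
        slots = [s for s in LEXICON if "{" + s + "}" in utt_tpl]
        for values in itertools.product(*(LEXICON[s] for s in slots)):
            binding = dict(zip(slots, values))
            examples.append((utt_tpl.format(**binding),
                             prog_tpl.format(**binding)))
    return examples

def paraphrase(utterance):
    """Placeholder for a learned paraphraser.

    In practice this would be a neural paraphrase model; here we apply a
    trivial rewrite so the sketch runs end to end.
    """
    return utterance.replace("papers by", "works written by")

if __name__ == "__main__":
    for utt, prog in synthesize_canonical_examples():
        print(f"{utt!r:55} -> {prog}")
        print(f"{paraphrase(utt)!r:55} -> {prog}")
```

Running the script prints each synthesized canonical utterance, a paraphrased variant, and the paired program, which is the kind of synthetic supervision the abstract refers to; the language and logical gaps arise when these synthetic pairs fail to match how real users phrase requests and which programs they actually need.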

Related research

04/21/2018 - Decoupling Structure and Lexicon for Zero-Shot Semantic Parsing
Building a semantic parser quickly in a new domain is a fundamental chal...

09/16/2020 - Grounded Adaptation for Zero-shot Executable Semantic Parsing
We propose Grounded Adaptation for Zero-shot Executable Semantic Parsing...

09/11/2020 - Visually Analyzing and Steering Zero Shot Learning
We propose a visual analytics system to help a user analyze and steer ze...

11/20/2019 - Zero-Shot Semantic Parsing for Instructions
We consider a zero-shot semantic parsing task: parsing instructions into...

09/18/2023 - Few-Shot Adaptation for Parsing Contextual Utterances with LLMs
We evaluate the ability of semantic parsers based on large language mode...

07/08/2016 - Collaborative Training of Tensors for Compositional Distributional Semantics
Type-based compositional distributional semantic models present an inter...

05/15/2022 - SeqZero: Few-shot Compositional Semantic Parsing with Sequential Prompts and Zero-shot Models
Recent research showed promising results on combining pretrained languag...
