SeqZero: Few-shot Compositional Semantic Parsing with Sequential Prompts and Zero-shot Models

05/15/2022
by Jingfeng Yang, et al.

Recent research has shown promising results from combining pretrained language models (LMs) with canonical utterances for few-shot semantic parsing. Canonical utterances are often lengthy and complex due to the compositional structure of formal languages, and learning to generate them requires a significant amount of data to reach high performance. When fine-tuned on only a few samples, LMs can easily forget pretrained knowledge, overfit to spurious biases, and suffer from compositional out-of-distribution generalization errors. To tackle these issues, we propose a novel few-shot semantic parsing method, SeqZero. SeqZero decomposes the problem into a sequence of sub-problems that correspond to the sub-clauses of the formal language. Based on this decomposition, the LMs only need to generate short answers using prompts for predicting sub-clauses, so SeqZero avoids generating a long canonical utterance at once. Moreover, SeqZero employs not only a few-shot model but also a zero-shot model to alleviate overfitting. In particular, SeqZero brings out the merits of both models via an ensemble equipped with our proposed constrained rescaling. SeqZero achieves state-of-the-art performance among BART-based models on GeoQuery and EcommerceQuery, two few-shot datasets with compositional data splits.
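The abstract describes two components: generating the formal query sub-clause by sub-clause with prompts, and ensembling a fine-tuned (few-shot) model with a pretrained (zero-shot) model via constrained rescaling. The sketch below illustrates that combination step in NumPy. The prompt wordings, the sub-clause order, the mixing weight alpha, and the function name are illustrative assumptions for a SQL-like target language, not the authors' released implementation.

```python
import numpy as np

# Hypothetical prompts for predicting each sub-clause of a SQL-like query.
# The decomposition order and wording here are assumptions, not the paper's exact prompts.
SUB_CLAUSE_PROMPTS = [
    "The answer selects: ",    # SELECT clause
    "The data comes from: ",   # FROM clause
    "The rows must satisfy: ", # WHERE clause
]

def ensemble_with_constrained_rescaling(p_fewshot, p_zeroshot, allowed_ids, alpha=0.5):
    """Combine few-shot and zero-shot next-token distributions.

    Both distributions are restricted to the tokens that are valid under the
    formal grammar (`allowed_ids`) and renormalized (the "constrained
    rescaling" idea), then linearly interpolated. `alpha` is an assumed
    mixing weight.
    """
    mask = np.zeros_like(p_zeroshot)
    mask[allowed_ids] = 1.0

    p_zero_constrained = p_zeroshot * mask
    p_zero_constrained /= p_zero_constrained.sum() + 1e-12

    p_few_constrained = p_fewshot * mask
    p_few_constrained /= p_few_constrained.sum() + 1e-12

    return alpha * p_few_constrained + (1 - alpha) * p_zero_constrained

# Toy example: a 5-token vocabulary where only tokens {1, 3} are grammatically valid.
p_few = np.array([0.10, 0.40, 0.05, 0.30, 0.15])   # fine-tuned (few-shot) model
p_zero = np.array([0.25, 0.20, 0.25, 0.10, 0.20])  # pretrained (zero-shot) model
print(ensemble_with_constrained_rescaling(p_few, p_zero, allowed_ids=[1, 3]))
```

Rescaling both distributions over the grammar-valid tokens before interpolation keeps the zero-shot model's probability mass on invalid continuations from dominating the mix, which is what lets the pretrained model contribute without derailing the constrained decoding.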


Related research

ZEROTOP: Zero-Shot Task-Oriented Semantic Parsing using Large Language Models (12/21/2022)
We explore the use of large language models (LLMs) for zero-shot semanti...

Translate Fill: Improving Zero-Shot Multilingual Semantic Parsing with Synthetic Data (09/09/2021)
While multilingual pretrained language models (LMs) fine-tuned on a sing...

From Paraphrasing to Semantic Parsing: Unsupervised Semantic Parsing via Synchronous Semantic Decoding (06/11/2021)
Semantic parsing is challenging due to the structure gap and the semanti...

Zero-Shot Cross-lingual Semantic Parsing (04/15/2021)
Recent work in crosslingual semantic parsing has successfully applied ma...

On The Ingredients of an Effective Zero-shot Semantic Parser (10/15/2021)
Semantic parsers map natural language utterances into meaning representa...

Low-Resource Compositional Semantic Parsing with Concept Pretraining (01/24/2023)
Semantic parsing plays a key role in digital voice assistants such as Al...

Finding needles in a haystack: Sampling Structurally-diverse Training Sets from Synthetic Data for Compositional Generalization (09/06/2021)
Modern semantic parsers suffer from two principal limitations. First, tr...
