Neural Semantic Parsing over Multiple Knowledge-bases

02/06/2017
by   Jonathan Herzig, et al.

A fundamental challenge in developing semantic parsers is the paucity of strong supervision in the form of language utterances annotated with logical form. In this paper, we propose to exploit structural regularities in language across different domains and train semantic parsers over multiple knowledge-bases (KBs) while sharing information across datasets. We find that we can substantially improve parsing accuracy by training a single sequence-to-sequence model over multiple KBs, when providing an encoding of the domain at decoding time. Our model achieves state-of-the-art performance on the Overnight dataset (containing eight domains), improves performance over a single-KB baseline from 75.6% to 79.6%, and obtains a 7x reduction in the number of model parameters.
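To make the architectural idea concrete, below is a minimal sketch (not the authors' code) of a shared sequence-to-sequence parser in which an encoding of the domain is fed to the decoder at each step. The class name, layer sizes, and the choice of a learned domain embedding (rather than, say, a one-hot vector) are illustrative assumptions, not details taken from the paper.

# Sketch: one seq2seq model shared across KBs, conditioned on a
# domain encoding at decoding time. All names and sizes are
# illustrative assumptions.
import torch
import torch.nn as nn


class MultiKBSeq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, num_domains,
                 emb_dim=128, hid_dim=256, dom_dim=16):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        # One shared encoder/decoder for all knowledge-bases.
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim + dom_dim, hid_dim, batch_first=True)
        # The domain encoding: a small learned vector per KB.
        self.dom_emb = nn.Embedding(num_domains, dom_dim)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids, domain_id):
        # Encode the utterance once, independent of the domain.
        _, state = self.encoder(self.src_emb(src_ids))
        # Concatenate the domain vector to every decoder input token,
        # so decoding is conditioned on which KB is being queried.
        dom = self.dom_emb(domain_id)                      # (batch, dom_dim)
        dom = dom.unsqueeze(1).expand(-1, tgt_ids.size(1), -1)
        dec_in = torch.cat([self.tgt_emb(tgt_ids), dom], dim=-1)
        dec_out, _ = self.decoder(dec_in, state)
        return self.out(dec_out)                           # logits per step


# Toy usage: a batch of 2 utterances from domain 3 of 8.
model = MultiKBSeq2Seq(src_vocab=1000, tgt_vocab=500, num_domains=8)
logits = model(torch.randint(0, 1000, (2, 12)),
               torch.randint(0, 500, (2, 20)),
               torch.tensor([3, 3]))
print(logits.shape)  # torch.Size([2, 20, 500])

Because the encoder, decoder, and output layer are shared across all eight domains, only the small per-domain embedding grows with the number of KBs, which is consistent with the parameter reduction reported in the abstract.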


Related research

06/12/2019 · Unified Semantic Parsing with Weak Supervision
Semantic parsing over multiple knowledge bases enables a parser to explo...

04/20/2017 · Cross-domain Semantic Parsing via Paraphrasing
Existing studies on semantic parsing mainly focus on the in-domain setti...

06/11/2016 · Data Recombination for Neural Semantic Parsing
Modeling crisp logical regularities is crucial in semantic parsing, maki...

12/23/2014 · Grammar as a Foreign Language
Syntactic constituency parsing is a fundamental problem in natural langu...

01/26/2021 · Graphonomy: Universal Image Parsing via Graph Reasoning and Transfer
Prior highly-tuned image parsing models are usually studied in a certain...

09/02/2019 · A Sketch-Based System for Semantic Parsing
This paper presents our semantic parsing system for the evaluation task ...

09/28/2017 · Soft Correspondences in Multimodal Scene Parsing
Exploiting multiple modalities for semantic scene parsing has been shown...
