Semantic Parsing with Semi-Supervised Sequential Autoencoders

09/29/2016
by Tomáš Kočiský, et al.

We present a novel semi-supervised approach for sequence transduction and apply it to semantic parsing. The unsupervised component is based on a generative model in which latent sentences generate the unpaired logical forms. We apply this method to a number of semantic parsing tasks, focusing on domains with limited access to labelled training data, and extend those datasets with synthetically generated logical forms.
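To make the idea concrete, here is a minimal sketch (plain PyTorch, not the authors' code) of the general shape of such a semi-supervised objective: a supervised path maps a paired sentence to its logical form, while an unsupervised path routes an unpaired logical form through a latent state that stands in for the missing sentence and reconstructs the logical form. All class and method names are illustrative, and the continuous latent state is a simplification; in the paper the latent is itself a sentence, i.e. a discrete word sequence, which requires additional machinery to train.

```python
# Sketch of a semi-supervised sequential autoencoder for semantic parsing.
# Assumptions: PyTorch, teacher-forced GRU decoding, a continuous latent
# standing in for the unobserved sentence. Not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SeqAE(nn.Module):
    def __init__(self, sent_vocab, lf_vocab, dim=128):
        super().__init__()
        self.sent_emb = nn.Embedding(sent_vocab, dim)
        self.lf_emb = nn.Embedding(lf_vocab, dim)
        self.sent_enc = nn.GRU(dim, dim, batch_first=True)  # encodes sentences
        self.lf_enc = nn.GRU(dim, dim, batch_first=True)    # encodes unpaired logical forms
        self.to_latent = nn.Linear(dim, dim)                 # logical form -> latent "sentence" state
        self.lf_dec = nn.GRU(dim, dim, batch_first=True)    # shared logical-form decoder
        self.out = nn.Linear(dim, lf_vocab)

    def decode_lf(self, h0, lf):
        # Teacher-forced decoding of a logical form from initial state h0.
        dec_out, _ = self.lf_dec(self.lf_emb(lf[:, :-1]), h0)
        return self.out(dec_out)

    def supervised_loss(self, sent, lf):
        # Paired path: sentence -> encoder state -> logical form.
        _, h = self.sent_enc(self.sent_emb(sent))
        logits = self.decode_lf(h, lf)
        return F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                               lf[:, 1:].reshape(-1))

    def unsupervised_loss(self, lf):
        # Unpaired path: logical form -> latent sentence-like state -> logical form.
        _, h = self.lf_enc(self.lf_emb(lf))
        h = torch.tanh(self.to_latent(h))
        logits = self.decode_lf(h, lf)
        return F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                               lf[:, 1:].reshape(-1))

# Usage: sum both losses, with unpaired logical forms augmenting the paired data.
model = SeqAE(sent_vocab=1000, lf_vocab=500)
sent = torch.randint(0, 1000, (4, 12))   # paired sentences
lf_p = torch.randint(0, 500, (4, 15))    # paired logical forms
lf_u = torch.randint(0, 500, (8, 15))    # unpaired (possibly synthetic) logical forms
loss = model.supervised_loss(sent, lf_p) + model.unsupervised_loss(lf_u)
loss.backward()
```

Because the logical-form decoder is shared between the two paths, reconstruction training on plentiful unpaired (or synthetically generated) logical forms regularises the same parameters used by the supervised parser, which is what makes the approach useful when labelled pairs are scarce.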

Related research

06/16/2015 · Parsing Natural Language Sentences by Semi-supervised Methods
We present our work on semi-supervised parsing of natural language sente...

05/07/2023 · Laziness Is a Virtue When It Comes to Compositionality in Neural Semantic Parsing
Nearly all general-purpose neural semantic parsers generate logical form...

05/27/2020 · Self-Training for Unsupervised Parsing with PRPN
Neural unsupervised parsing (UP) models learn to parse without access to...

04/04/2018 · Generative Visual Rationales
Interpretability and small labelled datasets are key issues in the pract...

05/21/2019 · Generating Logical Forms from Graph Representations of Text and Entities
Structured information about entities is critical for many semantic pars...

06/20/2016 · Unanimous Prediction for 100 Semantic Mappings
Can we train a system that, on any new input, either says "don't know" o...

07/25/2018 · Differentiable Perturb-and-Parse: Semi-Supervised Parsing with a Structured Variational Autoencoder
Human annotation for syntactic parsing is expensive, and large resources...
