A Globally Normalized Neural Model for Semantic Parsing

06/07/2021
by Chenyang Huang, et al.

In this paper, we propose a globally normalized model for context-free grammar (CFG)-based semantic parsing. Instead of predicting a probability, our model predicts a real-valued score at each step and does not suffer from the label bias problem. Experiments show that our approach outperforms locally normalized models on small datasets, but it does not yield improvement on a large dataset.
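The contrast the abstract draws can be illustrated with a small sketch. In a locally normalized model, each decoding step's scores are pushed through a softmax, so a step with few alternatives contributes almost nothing to the sequence probability no matter how confident its score is (the label bias problem); a globally normalized model sums raw scores over the whole sequence, so per-step magnitudes are preserved. The function names, scores, and toy sequences below are illustrative assumptions, not the paper's actual model:

```python
import math

def local_log_prob(steps):
    """Locally normalized: softmax over options at every step,
    then sum the log-probabilities of the chosen options.
    Each step is (chosen_score, [competitor_scores...])."""
    total = 0.0
    for chosen, competitors in steps:
        z = sum(math.exp(s) for s in [chosen] + competitors)
        total += chosen - math.log(z)
    return total

def global_score(steps):
    """Globally normalized: sum the raw real-valued scores.
    The partition function ranges over whole sequences, so it
    cancels when comparing candidates and can be ignored for ranking."""
    return sum(chosen for chosen, _ in steps)

# Two hypothetical rule-choice sequences (scores are made up):
# seq_a's first step is very confident but has no competitors;
# seq_b accumulates moderate preferences at every step.
seq_a = [(5.0, []), (0.1, [2.0])]
seq_b = [(1.0, [0.5]), (3.0, [0.2])]
```

Under local normalization the confident first step of `seq_a` contributes exactly zero log-probability (its softmax over a single option is 1 regardless of the score), so `seq_b` wins; under global scoring the 5.0 is retained and `seq_a` wins. This is the label bias effect the abstract refers to.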


Related research

- Globally Normalized Transition-Based Neural Networks (03/19/2016)
- Investigating Label Bias in Beam Search for Open-ended Text Generation (05/22/2020)
- Straight to the Tree: Constituency Parsing with Neural Syntactic Distance (06/11/2018)
- Data Recombination for Neural Semantic Parsing (06/11/2016)
- Supertagging: Introduction, learning, and application (12/19/2014)
- How Far are We from Effective Context Modeling? An Exploratory Study on Semantic Parsing in Context (02/03/2020)
