
Learning compositional structures for semantic graph parsing

06/08/2021
by Jonas Groschwitz et al.

AM dependency parsing is a method for neural semantic graph parsing that exploits the principle of compositionality. While AM dependency parsers have been shown to be fast and accurate across several graphbanks, they require explicit annotations of the compositional tree structures for training. In the past, these were obtained using complex graphbank-specific heuristics written by experts. Here we show how they can instead be trained directly on the graphs with a neural latent-variable model, drastically reducing the amount and complexity of manual heuristics. We demonstrate that our model picks up on several linguistic phenomena on its own and achieves comparable accuracy to supervised training, greatly facilitating the use of AM dependency parsing for new sembanks.
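The core idea of latent-variable training here is to treat the compositional tree as unobserved and maximize the marginal probability of the gold graph, i.e. sum over all derivations that yield it. A minimal sketch of that objective, with purely illustrative scores and function names (not the paper's actual model or code):

```python
import math

def log_sum_exp(xs):
    """Numerically stable log(sum(exp(x) for x in xs))."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def latent_tree_nll(all_tree_scores, consistent_tree_scores):
    """Negative log marginal likelihood of one graph.

    all_tree_scores:        scores of every candidate compositional tree.
    consistent_tree_scores: scores of the subset of trees that derive
                            the gold graph (the latent variable is
                            marginalized out over this subset).

    -log p(graph) = log Z(all trees) - log Z(consistent trees)
    """
    return log_sum_exp(all_tree_scores) - log_sum_exp(consistent_tree_scores)

# Toy example: three candidate trees, two of which derive the gold graph.
all_scores = [2.0, 0.5, -1.0]
consistent_scores = [2.0, 0.5]
loss = latent_tree_nll(all_scores, consistent_scores)
```

Because the consistent trees are a subset of all candidates, the loss is always non-negative, and minimizing it pushes probability mass onto derivations of the gold graph without ever requiring annotated trees.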


Related research:

- 05/29/2018: AMR Dependency Parsing with a Typed Semantic Algebra. "We present a semantic parser for Abstract Meaning Representations which ..."
- 04/17/2018: Learning Joint Semantic Parsers from Disjoint Data. "We present a new approach to learning semantic parsers from multiple dat..."
- 02/06/2018: Non-Projective Dependency Parsing via Latent Heads Representation (LHR). "In this paper, we introduce a novel approach based on a bidirectional re..."
- 01/19/2020: Learning Compositional Neural Information Fusion for Human Parsing. "This work proposes to combine neural networks with the compositional hie..."
- 04/29/2020: Normalizing Compositional Structures Across Graphbanks. "The emergence of a variety of graph-based meaning representations (MRs) ..."
- 08/09/2015: An Automatic Machine Translation Evaluation Metric Based on Dependency Parsing Model. "Most of the syntax-based metrics obtain the similarity by comparing the ..."
- 09/17/2013: Lambda Dependency-Based Compositional Semantics. "This short note presents a new formal language, lambda dependency-based ..."