Combining Improvements for Exploiting Dependency Trees in Neural Semantic Parsing

12/25/2021
by Defeng Xie, et al.

The dependency tree of a natural language sentence can capture the interactions between semantics and words. However, it is unclear whether existing methods that exploit such dependency information for semantic parsing can be combined to achieve further improvement, and how these methods relate to one another when combined. In this paper, we examine three methods for incorporating dependency information into a Transformer-based semantic parser and empirically study their combinations. First, we replace the standard self-attention heads in the encoder with parent-scaled self-attention (PASCAL) heads, i.e., heads that can attend to the dependency parent of each token. Second, we concatenate syntax-aware word representations (SAWRs), i.e., the intermediate hidden representations of a neural dependency parser, with ordinary word embeddings to enhance the encoder. Finally, we insert a constituent attention (CA) module into the encoder, which adds an extra constraint on attention heads so that they better capture the inherent dependency structure of input sentences. Transductive ensemble learning (TEL) is used for model aggregation, and an ablation study is conducted to show the contribution of each method. Our experiments show that CA is complementary to PASCAL or SAWRs, and that PASCAL + CA provides state-of-the-art performance among neural approaches on the ATIS, GEO, and JOBS datasets.
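For intuition, the following is a minimal sketch of one plausible parent-scaled attention head, assuming PyTorch and a Gaussian reweighting of attention weights centered at each token's dependency-parent position. The function name, the sigma parameter, and the choice to rescale after the softmax are illustrative assumptions; the exact formulation used by the paper may differ.

    import math
    import torch
    import torch.nn.functional as F

    def parent_scaled_attention(q, k, v, parent_idx, sigma=1.0):
        """One attention head whose weights are biased toward each token's
        dependency parent (a sketch of the PASCAL idea, not the paper's exact code).

        q, k, v    : (seq_len, d) query/key/value matrices for one head
        parent_idx : (seq_len,) index of each token's dependency parent
        sigma      : std. dev. of the Gaussian centered at the parent position
        """
        seq_len, d = q.shape
        # Standard scaled dot-product attention scores.
        scores = q @ k.transpose(0, 1) / math.sqrt(d)          # (seq_len, seq_len)
        # Gaussian weight over key positions, centered at each token's parent.
        positions = torch.arange(seq_len, dtype=torch.float)    # (seq_len,)
        centers = parent_idx.float().unsqueeze(1)                # (seq_len, 1)
        gauss = torch.exp(-((positions - centers) ** 2) / (2 * sigma ** 2))
        # Scale the softmaxed attention weights by the parent-centered prior
        # and renormalize so each row still sums to 1.
        attn = F.softmax(scores, dim=-1) * gauss
        attn = attn / attn.sum(dim=-1, keepdim=True)
        return attn @ v                                          # (seq_len, d)

    # Example: 5 tokens whose dependency parents come from an external parser.
    q = k = v = torch.randn(5, 64)
    parents = torch.tensor([1, 1, 1, 4, 1])
    out = parent_scaled_attention(q, k, v, parents)

The SAWR component described above is, by contrast, a simple feature-level change: the parser's intermediate hidden states are concatenated with the ordinary word embeddings before they enter the encoder, so no attention mechanism needs to be modified for that method.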

