TreePiece: Faster Semantic Parsing via Tree Tokenization

03/30/2023
by Sid Wang, et al.

Autoregressive (AR) encoder-decoder neural networks have proved successful in many NLP problems, including Semantic Parsing – a task that translates natural language into machine-readable parse trees. However, the sequential prediction process of AR models can be slow. To accelerate AR decoding for semantic parsing, we introduce a new technique called TreePiece that tokenizes a parse tree into subtrees and generates one subtree per decoding step. On the TopV2 benchmark, TreePiece decodes 4.6 times faster than standard AR, and achieves comparable speed with significantly higher accuracy than Non-Autoregressive (NAR) decoding.
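The core idea – cutting a parse tree into subtree tokens so the decoder emits several nodes per step – can be illustrated with a small sketch. The tree representation, vocabulary format, and greedy whole-subtree matching rule below are illustrative assumptions, not the paper's exact algorithm.

```python
def to_tuple(tree):
    """Convert a nested-list parse tree into a hashable nested tuple."""
    return tuple(to_tuple(t) if isinstance(t, list) else t for t in tree)

def tokenize(tree, vocab):
    """Greedily split a parse tree into subtree tokens.

    If the whole subtree rooted here is in the vocabulary, emit it as a
    single token; otherwise fall back to emitting the root label alone
    and recursing on the children. Larger vocabulary pieces therefore
    mean fewer tokens, i.e. fewer decoding steps.
    """
    t = to_tuple(tree)
    if t in vocab:
        return [t]                      # one token covers the whole subtree
    tokens = [tree[0]]                  # fallback: root label as its own token
    for child in tree[1:]:
        if isinstance(child, list):
            tokens += tokenize(child, vocab)
        else:
            tokens.append(child)        # terminal leaf
    return tokens

# A TOP-style parse for "weather in Boston" (labels are illustrative):
tree = ["IN:GET_WEATHER", ["SL:LOCATION", "Boston"]]

# With an empty vocabulary, every node is its own token (3 steps):
print(tokenize(tree, set()))           # ['IN:GET_WEATHER', 'SL:LOCATION', 'Boston']

# With the slot subtree in the vocabulary, decoding needs only 2 steps:
vocab = {to_tuple(["SL:LOCATION", "Boston"])}
print(tokenize(tree, vocab))           # ['IN:GET_WEATHER', ('SL:LOCATION', 'Boston')]
```

In this sketch the speedup comes purely from the shorter token sequence: each decoding step commits an entire vocabulary subtree rather than a single node, analogous to how BPE shortens word-level sequences.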


