A Fast Unified Model for Parsing and Sentence Understanding

03/19/2016
by Samuel R. Bowman, et al.

Tree-structured neural networks exploit valuable syntactic parse information as they interpret the meanings of sentences. However, they suffer from two key technical problems that make them slow and unwieldy for large-scale NLP tasks: they usually operate on parsed sentences and they do not directly support batched computation. We address these issues by introducing the Stack-augmented Parser-Interpreter Neural Network (SPINN), which combines parsing and interpretation within a single tree-sequence hybrid model by integrating tree-structured sentence interpretation into the linear sequential structure of a shift-reduce parser. Our model supports batched computation for a speedup of up to 25 times over other tree-structured models, and its integrated parser can operate on unparsed data with little loss in accuracy. We evaluate it on the Stanford NLI entailment task and show that it significantly outperforms other sentence-encoding models.
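To make the shift-reduce formulation concrete, the sketch below walks a toy sentence through SHIFT/REDUCE transitions, composing word vectors on a stack. It is a minimal illustration, not the paper's model: the random embeddings, the single untrained linear compose layer (a stand-in for SPINN's learned TreeLSTM-style composition and tracking components), and the hand-written transition sequence are all assumptions made for the example.

```python
# Minimal sketch of shift-reduce sentence composition in the spirit of SPINN.
# Not the authors' implementation: the embeddings, the untrained `compose`
# layer, and the transition sequence are illustrative assumptions.
import numpy as np

DIM = 4                                          # toy embedding size
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((DIM, 2 * DIM))    # untrained composition weights

def compose(left, right):
    """Combine two child representations into one parent vector."""
    return np.tanh(W @ np.concatenate([left, right]))

def encode(word_vectors, transitions):
    """Run one shift-reduce pass over a sentence.

    SHIFT moves the next word vector from the buffer onto the stack;
    REDUCE pops the top two stack entries and pushes their composition.
    After a valid transition sequence, the stack top encodes the sentence.
    """
    buffer = list(word_vectors)                  # words, left to right
    stack = []
    for op in transitions:
        if op == "SHIFT":
            stack.append(buffer.pop(0))
        else:                                    # "REDUCE"
            right, left = stack.pop(), stack.pop()
            stack.append(compose(left, right))
    return stack[-1]

# Toy example: a 3-word sentence parsed as ((w1 w2) w3).
words = [rng.standard_normal(DIM) for _ in range(3)]
sentence_vector = encode(words, ["SHIFT", "SHIFT", "REDUCE", "SHIFT", "REDUCE"])
print(sentence_vector.shape)                     # (4,)
```

Because this formulation turns tree composition into a linear sequence of stack operations, many sentences can be stepped through in lockstep, which is the property that enables the batched computation described in the abstract.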


Related research

09/06/2018
Top-down Tree Structured Decoding with Syntactic Connections for Neural Machine Translation and Parsing
The addition of syntax-aware decoding in Neural Machine Translation (NMT...

12/02/2016
Shift-Reduce Constituent Parsing with Neural Lookahead Features
Transition-based models can be fast and accurate for constituent parsing...

05/17/2018
Linear-Time Constituency Parsing with RNNs and Dynamic Programming
Recently, span-based constituency parsing has achieved competitive accur...

02/23/2023
Prosodic features improve sentence segmentation and parsing
Parsing spoken dialogue presents challenges that parsing text does not, ...

03/20/2019
Left-to-Right Dependency Parsing with Pointer Networks
We propose a novel transition-based algorithm that straightforwardly par...

03/01/2022
Fast-R2D2: A Pretrained Recursive Neural Network based on Pruned CKY for Grammar Induction and Text Representation
Recently CKY-based models show great potential in unsupervised grammar i...

06/19/2015
Structured Training for Neural Network Transition-Based Parsing
We present structured perceptron training for neural network transition-...
