Text Summarization as Tree Transduction by Top-Down TreeLSTM

09/24/2018
by Davide Bacciu, et al.

Extractive compression is a challenging natural language processing problem. This work contributes by formulating neural extractive compression as a parse tree transduction problem, rather than a sequence transduction task. Motivated by this, we introduce a deep neural model for learning structure-to-substructure tree transductions, extending the standard Long Short-Term Memory cell to account for parent-child relationships in the structural recursion. The proposed model achieves state-of-the-art performance on sentence compression benchmarks, both in terms of accuracy and compression rate.
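
To make the architectural idea concrete, the sketch below shows how a top-down TreeLSTM differs from the familiar sequential LSTM: the recursion runs from the root of the parse tree towards the leaves, so each node's gates are conditioned on its parent's hidden and cell state, and a per-node head scores whether the node survives into the compressed substructure. This is an illustrative PyTorch sketch under stated assumptions, not the authors' code: the class names (TopDownTreeLSTMCell, ExtractiveTreeTransducer), the single gate matrix shared across child positions, and the keep/delete head are hypothetical simplifications of the model the abstract describes.

```python
import torch
import torch.nn as nn

class TopDownTreeLSTMCell(nn.Module):
    """One step of a top-down TreeLSTM: unlike the usual bottom-up TreeLSTM,
    the recursion runs root-to-leaves, so a node is conditioned on its
    parent's (h, c) state rather than on its children's."""
    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        # One affine map yields the input/forget/output gates and the candidate.
        # NOTE: the paper's model may use position-dependent parameters per
        # child slot; sharing one matrix is a simplifying assumption here.
        self.gates = nn.Linear(input_dim + hidden_dim, 4 * hidden_dim)

    def forward(self, x, parent_h, parent_c):
        i, f, o, g = self.gates(torch.cat([x, parent_h], dim=-1)).chunk(4, dim=-1)
        c = torch.sigmoid(f) * parent_c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

class ExtractiveTreeTransducer(nn.Module):
    """Walks a parse tree top-down and scores every node for keep/delete,
    i.e. a structure-to-substructure transduction (hypothetical head)."""
    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.cell = TopDownTreeLSTMCell(input_dim, hidden_dim)
        self.keep_head = nn.Linear(hidden_dim, 1)  # P(node kept in output)

    def forward(self, node, parent_h=None, parent_c=None, scores=None):
        if parent_h is None:  # root: start the recursion from a zero parent state
            parent_h = torch.zeros(1, self.hidden_dim)
            parent_c = torch.zeros(1, self.hidden_dim)
        if scores is None:
            scores = {}
        h, c = self.cell(node["emb"], parent_h, parent_c)
        scores[node["id"]] = torch.sigmoid(self.keep_head(h))
        for child in node.get("children", []):
            self.forward(child, h, c, scores)  # children see this node's state
        return scores

if __name__ == "__main__":
    dim = 8
    # Toy parse tree with random node embeddings standing in for real features.
    tree = {"id": "S", "emb": torch.randn(1, dim), "children": [
        {"id": "NP", "emb": torch.randn(1, dim)},
        {"id": "VP", "emb": torch.randn(1, dim), "children": [
            {"id": "PP", "emb": torch.randn(1, dim)}]},
    ]}
    model = ExtractiveTreeTransducer(dim, hidden_dim=16)
    print({k: round(v.item(), 3) for k, v in model(tree).items()})
```

The top-down direction is what makes the substructure reading natural: by the time a node is scored, its hidden state already summarizes the path from the root, so keep/delete decisions can respect the ancestors' context rather than being made per token in isolation.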

Related research

- 10/31/2015 · Top-down Tree Long Short-Term Memory Networks: Long Short-Term Memory (LSTM) networks, a type of recurrent neural netwo...
- 07/10/2017 · Learning to Compose Task-Specific Tree Structures: For years, recursive neural networks (RvNNs) have been shown to be suita...
- 06/05/2020 · Sentence Compression as Deletion with Contextual Embeddings: Sentence compression is the task of creating a shorter version of an inp...
- 02/13/2019 · Sentence Compression via DC Programming Approach: Sentence compression is an important problem in natural language process...
- 02/05/2019 · Deep Tree Transductions - A Short Survey: The paper surveys recent extensions of the Long-Short Term Memory networ...
- 02/02/2020 · A Difference-of-Convex Programming Approach With Parallel Branch-and-Bound For Sentence Compression Via A Hybrid Extractive Model: Sentence compression is an important problem in natural language process...
- 02/08/2023 · Ordered Memory Baselines: Natural language semantics can be modeled using the phrase-structured mo...
