Ordered Memory Baselines

02/08/2023
by Daniel Borisov, et al.

Natural language semantics can be modelled with phrase-structure grammar, which lends itself naturally to a tree-structured representation. Accordingly, recent advances in natural language processing have come from recursive neural networks equipped with memory models that infer tree-structured representations of an input sentence. These tree models have yielded improvements in sentiment analysis and semantic recognition. Here we review the Ordered Memory model proposed by Shen et al. (2019) at NeurIPS 2019, and attempt either to construct baselines that outperform it or to build simpler models that perform equally well. We find that the Ordered Memory model performs on par with state-of-the-art tree-structured models, and outperforms simplified baselines that require fewer parameters.
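To make the idea of tree-structured composition concrete, here is a minimal sketch of a recursive neural network (RvNN) that merges word vectors bottom-up along a binary parse tree. This is an illustrative toy, not the Ordered Memory architecture itself: the parse tree is given rather than inferred, the composition function, dimensions, and vocabulary are all assumptions, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4  # hidden size (illustrative choice)

# Hypothetical composition parameters: parent = tanh(W @ [left; right] + b)
W = rng.standard_normal((DIM, 2 * DIM)) * 0.1
b = np.zeros(DIM)

# Toy word embeddings (random; a real model would learn these)
embed = {w: rng.standard_normal(DIM) * 0.1
         for w in ["the", "movie", "was", "great"]}

def compose(tree):
    """Recursively compose a binary parse tree into one sentence vector.

    Leaves are word strings; internal nodes are (left, right) pairs.
    """
    if isinstance(tree, str):
        return embed[tree]                  # leaf: embedding lookup
    left, right = tree                      # internal node: merge children
    children = np.concatenate([compose(left), compose(right)])
    return np.tanh(W @ children + b)

# Parse tree for "((the movie) (was great))"
vec = compose((("the", "movie"), ("was", "great")))
print(vec.shape)
```

The key point the abstract relies on is that the same composition function is applied at every node, so the network's structure mirrors the sentence's parse tree; models like Ordered Memory additionally learn to induce that tree from the flat word sequence.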

Related research

- Ordered Memory (10/29/2019)
- Learning to Compose Task-Specific Tree Structures (07/10/2017)
- Dynamic Compositionality in Recursive Neural Networks with Structure-aware Tag Representations (09/07/2018)
- Unsupervised Learning of Explainable Parse Trees for Improved Generalisation (04/11/2021)
- Efficient Beam Tree Recursion (07/20/2023)
- Text Summarization as Tree Transduction by Top-Down TreeLSTM (09/24/2018)
