Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars

09/10/2021
by Ryo Yoshida, et al.

In computational linguistics, hierarchical structures have been shown to make language models (LMs) more human-like. However, the previous literature has remained agnostic about the parsing strategies of these hierarchical models. In this paper, we investigated whether hierarchical structures make LMs more human-like, and if so, which parsing strategy is most cognitively plausible. To address this question, we evaluated three LMs against human reading times for Japanese, a head-final, left-branching language: a Long Short-Term Memory (LSTM) network as a sequential model, and Recurrent Neural Network Grammars (RNNGs) with top-down and left-corner parsing strategies as hierarchical models. Our computational modeling demonstrated that left-corner RNNGs outperformed both top-down RNNGs and the LSTM, suggesting that hierarchical and left-corner architectures are more cognitively plausible than top-down or sequential architectures. We also discuss the relationships between cognitive plausibility and (i) perplexity, (ii) parsing, and (iii) beam size.
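To make the contrast between the two parsing strategies concrete, the sketch below derives the action sequences that top-down and left-corner transition systems would produce for a simple head-final Japanese clause. The tree encoding, action names (NT, GEN, REDUCE), and example sentence are illustrative assumptions in the spirit of RNNG-style oracles, not the paper's exact implementation.

```python
# Minimal sketch contrasting top-down and left-corner action oracles
# for RNNG-style transition systems. Trees are nested tuples whose
# first element is the nonterminal label; strings are words.
# (Assumed encoding for illustration only.)

def top_down_actions(node):
    """Top-down: NT(X) opens a constituent before any of its words are generated."""
    if isinstance(node, str):              # leaf = word
        return [f"GEN({node})"]
    label, *children = node
    actions = [f"NT({label})"]
    for child in children:
        actions += top_down_actions(child)
    return actions + ["REDUCE"]

def left_corner_actions(node):
    """Left-corner: NT(X) is predicted only after its left corner (first child) is complete."""
    if isinstance(node, str):
        return [f"GEN({node})"]
    label, first, *rest = node
    actions = left_corner_actions(first) + [f"NT({label})"]
    for child in rest:
        actions += left_corner_actions(child)
    return actions + ["REDUCE"]

# A head-final, left-branching clause: "Taro-ga hon-o yonda" (Taro read a book)
tree = ("S", ("NP", "taro-ga"), ("VP", ("NP", "hon-o"), "yonda"))

print(" ".join(top_down_actions(tree)))
print(" ".join(left_corner_actions(tree)))
```

For this left-branching tree, the top-down oracle must open S and NP before generating any word, whereas the left-corner oracle predicts each nonterminal only after its leftmost child has been built, keeping predictions more incremental for head-final languages such as Japanese.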


Related research

10/31/2015 · Top-down Tree Long Short-Term Memory Networks
Long Short-Term Memory (LSTM) networks, a type of recurrent neural netwo...

04/07/2016 · Geometric Scene Parsing with Hierarchical LSTM
This paper addresses the problem of geometric scene parsing, i.e. simult...

10/22/2018 · Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks
Recurrent neural network (RNN) models are widely used for processing seq...

12/14/2017 · A Hierarchical Recurrent Neural Network for Symbolic Melody Generation
In recent years, neural networks have been used to generate music pieces...

11/15/2017 · A Sequential Neural Encoder with Latent Structured Description for Modeling Sentences
In this paper, we propose a sequential neural encoder with latent struct...

01/12/2023 · LiteLSTM Architecture Based on Weights Sharing for Recurrent Neural Networks
Long short-term memory (LSTM) is one of the robust recurrent neural netw...

02/20/2018 · Attentive Tensor Product Learning for Language Generation and Grammar Parsing
This paper proposes a new architecture - Attentive Tensor Product Learni...
