A Neural Model of Adaptation in Reading

08/29/2018
by Marten van Schijndel, et al.

It has been argued that humans rapidly adapt their lexical and syntactic expectations to match the statistics of the current linguistic context. We provide further support to this claim by showing that the addition of a simple adaptation mechanism to a neural language model improves our predictions of human reading times compared to a non-adaptive model. We analyze the performance of the model on controlled materials from psycholinguistic experiments and show that it adapts not only to lexical items but also to abstract syntactic structures.
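The adaptation mechanism described here amounts to test-time fine-tuning: after processing each sentence, the language model's parameters are updated on that sentence before computing surprisal for the next one. As a minimal illustration of the idea (not the paper's actual LSTM implementation), the sketch below uses a toy unigram model whose counts are folded in after each "read" sentence; all class and variable names are hypothetical:

```python
import math
from collections import Counter

class AdaptiveUnigramLM:
    """Toy stand-in for an adaptive neural LM: a smoothed unigram model
    whose statistics are updated after each sentence is read."""

    def __init__(self, train_tokens, alpha=1.0):
        self.counts = Counter(train_tokens)
        self.vocab = set(train_tokens)
        self.alpha = alpha  # add-alpha smoothing constant

    def surprisal(self, token):
        # Surprisal in bits: -log2 P(token) under the current model state.
        total = sum(self.counts.values())
        v = len(self.vocab) + 1  # +1 reserves mass for unseen tokens
        p = (self.counts[token] + self.alpha) / (total + self.alpha * v)
        return -math.log2(p)

    def adapt(self, sentence_tokens):
        # Test-time adaptation step: fold the just-read sentence's
        # statistics into the model before processing what follows.
        self.counts.update(sentence_tokens)
        self.vocab.update(sentence_tokens)

lm = AdaptiveUnigramLM("the cat sat on the mat".split())
before = lm.surprisal("syntax")          # high: word unseen in training
lm.adapt("syntax syntax".split())
after = lm.surprisal("syntax")           # lower after adaptation
```

In the paper's setting the update is a gradient step on an LSTM's weights rather than a count update, but the prediction it captures is the same: recently encountered lexical items and structures become less surprising, mirroring the drop in human reading times on repeated material.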
