Finding Syntax in Human Encephalography with Beam Search

06/11/2018
by John Hale et al.

Recurrent neural network grammars (RNNGs) are generative models of (tree, string) pairs that rely on neural networks to evaluate derivational choices. Parsing with them using beam search yields a variety of incremental complexity metrics such as word surprisal and parser action count. When used as regressors against human electrophysiological responses to naturalistic text, they derive two amplitude effects: an early peak and a P600-like later peak. By contrast, a non-syntactic neural language model yields no reliable effects. Model comparisons attribute the early peak to syntactic composition within the RNNG. This pattern of results recommends the RNNG+beam search combination as a mechanistic model of the syntactic processing that occurs during normal human language comprehension.
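
To make the pipeline in the abstract concrete, here is a minimal Python sketch, not the authors' code: it approximates word surprisal from the log-probabilities carried by a hypothetical beam of incremental parser states (surprisal of a word is the drop in log marginal prefix probability from one beam to the next), then fits an ordinary least-squares regression of single-trial ERP amplitudes on that surprisal plus a nuisance covariate. All names, beam entries, and numbers below are illustrative assumptions, not values from the paper.

import numpy as np

def word_surprisal(prev_beam, curr_beam):
    # Each beam is a list of (partial_derivation, log_prob) pairs.
    # The prefix probability is approximated by summing over the beam,
    # and surprisal is the drop in log marginal probability at this word.
    def log_marginal(beam):
        logs = np.array([lp for _, lp in beam])
        m = logs.max()
        return m + np.log(np.exp(logs - m).sum())  # log-sum-exp
    return -(log_marginal(curr_beam) - log_marginal(prev_beam))

def fit_erp_regression(amplitudes, surprisal, covariate):
    # OLS fit: amplitude ~ intercept + surprisal + covariate.
    X = np.column_stack([np.ones_like(surprisal), surprisal, covariate])
    beta, *_ = np.linalg.lstsq(X, amplitudes, rcond=None)
    return beta  # beta[1] is the surprisal effect

# Toy beams for one word (scores are made up); surprisal comes out to 2.1 nats.
prev = [("(S (NP the dog", -2.1), ("(S (NP (NP the dog", -3.0)]
curr = [("(S (NP the dog) (VP barked", -4.2), ("(S (NP the dog) (VP (V barked", -5.1)]
print(word_surprisal(prev, curr))

# Simulated regression: amplitudes built from surprisal plus noise, so the
# fitted coefficient on surprisal should come back near 0.5.
rng = np.random.default_rng(0)
surp = rng.gamma(2.0, 2.0, size=200)
freq = rng.normal(size=200)
amps = 0.5 * surp + 0.2 * freq + rng.normal(size=200)
print(fit_erp_regression(amps, surp, freq))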


Related research

Neural Language Modeling by Jointly Learning Syntax and Lexicon (11/02/2017)
Effective Batching for Recurrent Neural Network Grammars (05/31/2021)
Speculative Beam Search for Simultaneous Translation (09/12/2019)
Probing for Incremental Parse States in Autoregressive Language Models (11/17/2022)
An End-to-End Neural Network for Polyphonic Piano Music Transcription (08/07/2015)
Statistical modelling of conidial discharge of entomophthoralean fungi using a newly discovered Pandora species (11/11/2018)
Modeling structure-building in the brain with CCG parsing and large language models (10/28/2022)
