Distributed neural encoding of binding to thematic roles

10/24/2021
by Matthias Lalisse et al.

A framework and method are proposed for studying constituent composition in fMRI. The method produces estimates of the neural patterns encoding complex linguistic structures, under the assumption that the contributions of individual constituents are additive. Like standard techniques for modeling compositional structure in fMRI, the proposed method employs pattern superposition to synthesize complex structures from their parts. Unlike those techniques, the superpositions are sensitive to the structural positions of constituents, making them irreducible to structure-indiscriminate ("bag-of-words") models of composition. A reanalysis of data from Frankland and Greene (2015) shows that comparing neural predictive models with differing specifications can illuminate aspects of neural representational content that are not apparent when composition is left unmodelled. The results indicate that the neural instantiations of the bindings of fillers to thematic roles in a sentence are non-orthogonal, and therefore spatially overlapping.
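The contrast between structure-sensitive superposition and bag-of-words composition can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: filler words are bound to thematic roles via outer products (in the spirit of tensor product representations), and the resulting bindings are summed. All vector dimensions and word/role names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical filler and role vectors (dimensions chosen arbitrarily)
fillers = {w: rng.standard_normal(8) for w in ["dog", "cat"]}
roles = {r: rng.standard_normal(4) for r in ["agent", "patient"]}

def bag_of_words(words):
    # Structure-indiscriminate composition: plain sum of filler vectors
    return sum(fillers[w] for w in words)

def role_binding(pairs):
    # Role-sensitive composition: sum of filler-by-role outer products,
    # flattened into a single pattern vector
    return sum(np.outer(fillers[w], roles[r]).ravel() for w, r in pairs)

# "The dog chased the cat" vs. "The cat chased the dog"
s1_bow = bag_of_words(["dog", "cat"])
s2_bow = bag_of_words(["cat", "dog"])

s1_tpr = role_binding([("dog", "agent"), ("cat", "patient")])
s2_tpr = role_binding([("cat", "agent"), ("dog", "patient")])

print(np.allclose(s1_bow, s2_bow))   # True: bag-of-words cannot tell them apart
print(np.allclose(s1_tpr, s2_tpr))   # False: role binding keeps them distinct
```

The point of the example is that swapping which filler occupies which role changes the role-sensitive pattern but leaves the bag-of-words sum unchanged, which is what makes the two model classes empirically distinguishable against fMRI data.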


Related research

06/13/2018 — fMRI Semantic Category Decoding using Linguistic Encoding of Word Embeddings
The dispute of how the human brain represents conceptual knowledge has b...

06/29/2020 — A shared neural encoding model for the prediction of subject-specific fMRI response
The increasing popularity of naturalistic paradigms in fMRI (such as mov...

09/11/2018 — Assessing Composition in Sentence Vector Representations
An important component of achieving language understanding is mastering ...

04/01/2016 — A Compositional Approach to Language Modeling
Traditional language models treat language as a finite state automaton o...

09/12/2018 — Neural Melody Composition from Lyrics
In this paper, we study a novel task that learns to compose music from n...

12/20/2018 — RNNs Implicitly Implement Tensor Product Representations
Recurrent neural networks (RNNs) can learn continuous vector representat...

07/23/2017 — Composing Distributed Representations of Relational Patterns
Learning distributed representations for relation instances is a central...
