Can Peanuts Fall in Love with Distributional Semantics?

01/20/2023
by James A. Michaelov et al.

The context in which a sentence appears can drastically alter our expectations about upcoming words - for example, following a short story involving an anthropomorphic peanut, experimental participants are more likely to expect the sentence 'the peanut was in love' than 'the peanut was salted', as indexed by N400 amplitude (Nieuwland & van Berkum, 2006). This rapid, dynamic updating of comprehenders' expectations about the kinds of events a peanut may take part in has been explained using the construct of situation models - mental representations of the key elements of the event under discussion, updated as the discourse unfolds, in this case including the peanut protagonist. However, recent work showing that N400 amplitude can be predicted from distributional information alone raises the question of whether situation models are in fact necessary for the kinds of contextual effects observed in previous work. To investigate this question, we attempt to model the results of Nieuwland and van Berkum (2006) using six computational language models and three sets of word vectors, none of which have explicit situation models or semantic grounding. We find that the effect reported by Nieuwland and van Berkum (2006) can be fully modeled by two of the language models and two of the sets of word vectors, with the others showing a reduced effect. Thus, at least some processing effects normally explained through situation models may not in fact require explicit situation models.
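As a minimal sketch of the distributional approach described above (not the authors' exact pipeline), one common way to derive an N400-style prediction from a language model is to compute its surprisal for the critical word given the discourse context. The snippet below uses GPT-2 via the Hugging Face transformers library purely as an illustrative stand-in; the model choice, the surprisal() helper, and the paraphrased peanut contexts are assumptions for the example, not materials from the study.

```python
# Minimal sketch: language-model surprisal as an N400-style predictor.
# GPT-2, the surprisal() helper, and the example contexts are illustrative
# assumptions, not the models or stimuli used in the paper.
import math

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()


def surprisal(context: str, target: str) -> float:
    """Summed surprisal, in bits, of `target` given `context`."""
    context_ids = tokenizer(context, return_tensors="pt").input_ids
    target_ids = tokenizer(target, return_tensors="pt").input_ids
    input_ids = torch.cat([context_ids, target_ids], dim=1)
    with torch.no_grad():
        log_probs = torch.log_softmax(model(input_ids).logits, dim=-1)
    offset = context_ids.size(1)
    total = 0.0
    for i in range(target_ids.size(1)):
        # logits at position j predict the token at position j + 1
        token_log_prob = log_probs[0, offset + i - 1, target_ids[0, i]].item()
        total += -token_log_prob / math.log(2)
    return total


# A paraphrase of the discourse manipulation: an animate-peanut story context.
context = "The peanut sang and danced about a girl he had just met. The peanut was"
print(surprisal(context, " in love"))  # expected to be lower (more predictable)
print(surprisal(context, " salted"))   # expected to be higher (less predictable)
```

In the study itself, such surprisal (or word-vector distance) measures would be compared across context conditions and related to the N400 effect; this snippet only illustrates the basic measurement.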

Related research

06/17/2019 - A Structured Distributional Model of Sentence Meaning and Processing
Most compositional distributional semantic models represent sentence mea...

09/02/2021 - So Cloze yet so Far: N400 Amplitude is Better Predicted by Distributional Information than Human Predictability Judgements
More predictable words are easier to process - they are read faster and ...

11/23/2021 - Using Distributional Principles for the Semantic Study of Contextual Language Models
Many studies were recently done for investigating the properties of cont...

03/11/2022 - When classifying grammatical role, BERT doesn't care about word order... except when it matters
Because meaning can often be inferred from lexical semantics alone, word...

10/03/2017 - Is Structure Necessary for Modeling Argument Expectations in Distributional Semantics?
Despite the number of NLP studies dedicated to thematic fit estimation, ...

07/20/2017 - High-risk learning: acquiring new word vectors from tiny data
Distributional semantics models are known to struggle with small data. I...

12/02/2022 - Event knowledge in large language models: the gap between the impossible and the unlikely
People constantly use language to learn about the world. Computational l...