The Interaction of Memory and Attention in Novel Word Generalization: A Computational Investigation

02/18/2016
by Erin Grant, et al.

People exhibit a tendency to generalize a novel noun to the basic level in a hierarchical taxonomy -- a cognitively salient category such as "dog" -- with the degree of generalization depending on the number and type of exemplars. Recently, a change in the presentation timing of exemplars has also been shown to have an effect, surprisingly reversing the previously observed pattern of basic-level generalization. We explore the precise mechanisms that could lead to such behavior by extending a computational model of word learning and word generalization to integrate the cognitive processes of memory and attention. Our results show that the interaction of forgetting and attention to novelty, together with sensitivity to both type and token frequencies of exemplars, enables the model to replicate the empirical results across different presentation timings. These results reinforce the need to incorporate general cognitive processes within word learning models to better understand the range of observed behaviors in vocabulary acquisition.
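The two mechanisms named above, forgetting and attention to novelty, can be illustrated with a minimal sketch. This is not the authors' actual model: the exponential decay form, the decay rate, the novelty weighting, and the exemplar names are all illustrative assumptions; the sketch only shows how presentation timing can change the memory trace an exemplar leaves behind.

```python
import math

def memory_strength(initial, elapsed, decay_rate=0.5):
    # Exponential forgetting: a trace decays with time since presentation.
    # (Illustrative form; the paper's model may use a different decay.)
    return initial * math.exp(-decay_rate * elapsed)

def novelty_attention(times_seen):
    # Attention to novelty: rarely seen exemplars draw more attention,
    # so their traces are encoded more strongly.
    return 1.0 / (1.0 + times_seen)

def encode_exemplars(presentations):
    """presentations: list of (exemplar, time) pairs in presentation order.
    Returns each exemplar's accumulated memory strength at the final time."""
    seen = {}       # exemplar -> number of prior presentations
    strengths = {}  # exemplar -> summed, decayed trace strength
    last_time = presentations[-1][1]
    for exemplar, t in presentations:
        attention = novelty_attention(seen.get(exemplar, 0))
        seen[exemplar] = seen.get(exemplar, 0) + 1
        strengths[exemplar] = strengths.get(exemplar, 0.0) + \
            memory_strength(attention, last_time - t)
    return strengths

# Same three exemplars, two presentation timings (hypothetical labels):
massed = [("ex1", 0), ("ex1", 1), ("ex2", 2), ("ex2", 3), ("ex3", 4), ("ex3", 5)]
spaced = [("ex1", 0), ("ex2", 1), ("ex3", 2), ("ex1", 6), ("ex2", 7), ("ex3", 8)]

print(encode_exemplars(massed))
print(encode_exemplars(spaced))
```

Because traces decay between presentations while novelty boosts encoding of less-recently-repeated exemplars, the two timings leave different strength profiles over the same exemplar set, which is the kind of interaction that can shift generalization behavior.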


