Learning word-referent mappings and concepts from raw inputs

03/12/2020
by Wai Keen Vong et al.

How do children learn correspondences between language and the world from noisy, ambiguous, naturalistic input? One hypothesis is via cross-situational learning: tracking words and their possible referents across multiple situations allows learners to disambiguate correct word-referent mappings (Yu & Smith, 2007). However, previous models of cross-situational word learning operate on highly simplified representations, side-stepping two important aspects of the actual learning problem. First, how can word-referent mappings be learned from raw inputs such as images? Second, how can these learned mappings generalize to novel instances of a known word? In this paper, we present a neural network model trained from scratch via self-supervision that takes in raw images and words as inputs, and show that it can learn word-referent mappings from fully ambiguous scenes and utterances through cross-situational learning. In addition, the model generalizes to novel word instances, locates referents of words in a scene, and shows a preference for mutual exclusivity.
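
The abstract does not spell out the architecture, but a minimal sketch may help make the setup concrete. Assuming a word encoder and an image encoder trained into a shared embedding space (this is an illustrative assumption, not the authors' actual model), co-occurrence within a situation provides the only supervision: a word and the referent crop it appeared with are pulled together, while referents from other situations in the batch serve as negatives. All module names, layer sizes, and the specific contrastive loss below are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WordEncoder(nn.Module):
    """Embeds word tokens (indices into a small vocabulary) into a shared space."""

    def __init__(self, vocab_size: int, dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)

    def forward(self, word_ids):  # (batch,) -> (batch, dim), unit-normalized
        return F.normalize(self.embed(word_ids), dim=-1)


class ImageEncoder(nn.Module):
    """Maps raw image crops (candidate referents) into the same embedding space."""

    def __init__(self, dim: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(32, dim)

    def forward(self, images):  # (batch, 3, H, W) -> (batch, dim), unit-normalized
        h = self.conv(images).flatten(1)
        return F.normalize(self.proj(h), dim=-1)


def cross_situational_loss(word_vecs, image_vecs, temperature: float = 0.1):
    """Contrastive objective: each word embedding should be closest to the referent
    it co-occurred with, relative to referents drawn from other situations."""
    logits = word_vecs @ image_vecs.t() / temperature  # (batch, batch) similarities
    targets = torch.arange(len(word_vecs), device=logits.device)
    return F.cross_entropy(logits, targets)
```

In a sketch of this kind, repeated exposure across batches plays the role of cross-situational disambiguation: a word that co-occurs with several candidate objects in any single scene is, across many scenes, consistently paired only with its true referent, so that pairing accumulates the strongest signal.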

Related research

Competition in Cross-situational Word Learning: A Computational Study (12/06/2020)
Children learn word meanings by tapping into the commonalities across di...

Unsupervised Hyperalignment for Multilingual Word Embeddings (11/02/2018)
We consider the problem of aligning continuous word representations, lea...

A Neural Network Architecture for Learning Word-Referent Associations in Multiple Contexts (05/20/2019)
This article proposes a biologically inspired neurocomputational archite...

A Computational Model of Early Word Learning from the Infant's Point of View (06/04/2020)
Human infants have the remarkable ability to learn the associations betw...

Calculating Probabilities Simplifies Word Learning (02/22/2017)
Children can use the statistical regularities of their environment to le...

Which one is the dax? Achieving mutual exclusivity with neural networks (04/08/2020)
Learning words is a challenge for children and neural networks alike. Ho...

Cross-situational learning of large lexicons with finite memory (09/28/2018)
Cross-situational word learning, wherein a learner combines information ...
