Word learning and the acquisition of syntactic–semantic overhypotheses

05/14/2018
by Jon Gauthier, et al.

Children learning their first language face multiple problems of induction: how to learn the meanings of words, and how to build meaningful phrases from those words according to syntactic rules. We consider how children might solve these problems efficiently by solving them jointly, via a computational model that learns the syntax and semantics of multi-word utterances in a grounded reference game. We select a well-studied empirical case in which children are aware of patterns linking the syntactic and semantic properties of words: the properties picked out by base nouns tend to be related to shape, while prenominal adjectives tend to refer to other properties such as color. We show that children applying such inductive biases are accurately reflecting the statistics of child-directed speech, and that inducing similar biases in our computational model captures children's behavior in a classic adjective learning experiment. A model incorporating these biases also learns with markedly greater data efficiency than a baseline model that learns without forming syntax-sensitive overhypotheses of word meaning. Thus solving a more complex joint inference problem may make the full problem of language acquisition easier, not harder.
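
The abstract's key mechanism, an overhypothesis linking a word's syntactic frame to the kind of property it names, can be illustrated with a small count-based sketch. The Python snippet below is not the authors' model; the class name, feature inventory, hyperparameter, and toy data are illustrative assumptions. It only shows how category-level counts can act as a shared prior, so that a novel noun is guessed to name a shape and a novel prenominal adjective a color.

# Minimal sketch (not the authors' model): a hierarchical count-based learner
# that forms an "overhypothesis" linking syntactic category to the kind of
# property a word picks out. All names and numbers here are illustrative.

from collections import defaultdict

FEATURE_TYPES = ["shape", "color", "material"]

class OverhypothesisLearner:
    def __init__(self, alpha=1.0):
        # alpha: symmetric smoothing pseudo-count (an assumed hyperparameter)
        self.alpha = alpha
        # word-level counts: (category, word) -> {feature_type: count}
        self.word_counts = defaultdict(lambda: defaultdict(float))
        # category-level counts: category -> {feature_type: count}
        # (these shared counts play the role of the overhypothesis)
        self.cat_counts = defaultdict(lambda: defaultdict(float))

    def observe(self, category, word, feature_type):
        """Record that `word`, used in syntactic `category`, referred to a
        property of kind `feature_type` (e.g. 'ball' as a noun -> 'shape')."""
        self.word_counts[(category, word)][feature_type] += 1.0
        self.cat_counts[category][feature_type] += 1.0

    def feature_type_posterior(self, category, word):
        """Posterior over which kind of property `word` picks out.
        For a novel word the word-level counts are empty, so the prediction
        falls back on the category-level overhypothesis. (A word's own
        observations also appear in the category counts; for this toy sketch
        that extra weight is harmless.)"""
        scores = {}
        for f in FEATURE_TYPES:
            prior = self.alpha + self.cat_counts[category][f]
            scores[f] = prior + self.word_counts[(category, word)][f]
        total = sum(scores.values())
        return {f: s / total for f, s in scores.items()}


if __name__ == "__main__":
    learner = OverhypothesisLearner()
    # Toy child-directed data: nouns usually name shapes, adjectives colors.
    for noun in ["ball", "cup", "dog"]:
        learner.observe("noun", noun, "shape")
    for adj in ["red", "blue"]:
        learner.observe("adjective", adj, "color")

    # A novel word inherits the bias of its syntactic frame:
    print(learner.feature_type_posterior("noun", "dax"))        # skews toward 'shape'
    print(learner.feature_type_posterior("adjective", "daxy"))  # skews toward 'color'

Running the sketch prints posteriors in which the novel noun "dax" skews toward shape and the novel adjective "daxy" toward color, mirroring the syntax-sensitive bias described in the abstract.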

Related research:

- 11/27/2016 · The polysemy of the words that children learn over time
  Here we study polysemy as a potential learning bias in vocabulary learni...
- 06/06/2023 · Language acquisition: do children and language models follow similar learning stages?
  During language acquisition, children follow a typical sequence of learn...
- 06/08/2022 · Syntactic Inductive Biases for Deep Learning Methods
  In this thesis, we try to build a connection between the two schools by ...
- 05/10/2021 · Language Acquisition is Embodied, Interactive, Emotive: a Research Proposal
  Humans' experience of the world is profoundly multimodal from the beginn...
- 10/06/2020 · LSTMs Compose (and Learn) Bottom-Up
  Recent work in NLP shows that LSTM language models capture hierarchical ...
- 06/15/2022 · How Adults Understand What Young Children Say
  Children's early speech often bears little resemblance to adult speech i...
- 02/22/2017 · Calculating Probabilities Simplifies Word Learning
  Children can use the statistical regularities of their environment to le...
