Language acquisition: do children and language models follow similar learning stages?

06/06/2023
by Linnea Evanson et al.

During language acquisition, children follow a typical sequence of learning stages: they first learn to categorize phonemes, then develop their lexicon, and eventually master increasingly complex syntactic structures. However, the computational principles that lead to this learning trajectory remain largely unknown. To investigate this, we compare the learning trajectories of deep language models to those of children. Specifically, we test whether GPT-2, during its training, exhibits stages of language acquisition comparable to those observed in children aged between 18 months and 6 years. For this, we train 48 GPT-2 models from scratch and evaluate their syntactic and semantic abilities at each training step, using 96 probes curated from the BLiMP, Zorro, and BIG-Bench benchmarks. We then compare these evaluations with the behavior of 54 children during language production. Our analyses reveal three main findings. First, like children, the language models tend to learn linguistic skills in a systematic order. Second, this learning scheme is parallel: the language tasks that are learned last nevertheless improve from the very first training steps. Third, some, but not all, learning stages are shared between children and these language models. Overall, these results shed new light on the principles of language acquisition and highlight important divergences in how humans and modern algorithms learn to process natural language.
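To make the probing setup concrete, here is a minimal sketch of the standard minimal-pair scoring used by BLiMP- and Zorro-style benchmarks, where a model passes an item when it assigns higher log-likelihood to the grammatical sentence of a pair than to its ungrammatical twin. This is not the authors' released code; the pretrained "gpt2" checkpoint stands in for one of the 48 intermediate training checkpoints, and the example pair is an illustrative stand-in for a curated probe item.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")  # stand-in for one training checkpoint
model.eval()

def sentence_log_prob(sentence: str) -> float:
    """Summed log-probability of a sentence under the model."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        # With labels=input_ids, HuggingFace returns the mean negative
        # log-likelihood over the ids.size(1) - 1 predicted tokens;
        # multiplying back recovers the summed log-probability.
        loss = model(ids, labels=ids).loss
    return -loss.item() * (ids.size(1) - 1)

# Illustrative minimal pair in the style of a BLiMP anaphor-agreement probe.
grammatical = "Many teachers are helping themselves."
ungrammatical = "Many teachers are helping herself."
print("passes probe:", sentence_log_prob(grammatical) > sentence_log_prob(ungrammatical))
```

Running such a scorer over every saved checkpoint of a training run, one probe at a time, yields the per-task learning curves that can then be compared against children's production data.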
