What makes a language easy to deep-learn?

02/23/2023
by Lukas Galke et al.

Neural networks drive the success of natural language processing. A fundamental property of natural languages is their compositional structure, allowing us to describe new meanings systematically. However, neural networks notoriously struggle with systematic generalization and do not necessarily benefit from compositional structure in emergent communication simulations. Here, we test how neural networks compare to humans in learning and generalizing a new language. We do this by closely replicating an artificial language learning study (originally conducted with human participants) and evaluating the memorization and generalization capabilities of deep neural networks with respect to the degree of structure in the input language. Our results show striking similarities between humans and deep neural networks: more structured linguistic input leads to more systematic generalization and better convergence between humans and neural-network agents, as well as among different neural agents. We then replicate this structure bias, found in both humans and our recurrent neural networks, with a Transformer-based large language model (GPT-3), showing a similar benefit of structured linguistic input with respect to generalization systematicity and memorization errors. These findings show that the underlying structure of languages is crucial for systematic generalization. Given the correlation between community size and linguistic structure in natural languages, our findings underscore the challenge of automated processing of low-resource languages. Nevertheless, the similarity between humans and machines opens new avenues for language evolution research.
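The key manipulated variable here is the degree of structure in the input language. In this line of work, structure is typically quantified as topographic similarity: the correlation between pairwise distances in meaning space and pairwise edit distances between the corresponding labels. The sketch below is a minimal, hypothetical illustration of that measure in Python; it is not the authors' code, and the toy meanings and labels are invented for the example.

    # Hypothetical sketch: quantifying linguistic structure as topographic
    # similarity (the correlation between meaning distances and label distances).
    # Not the paper's code; the toy data below are invented for illustration.
    from itertools import combinations
    from scipy.stats import spearmanr

    def levenshtein(a: str, b: str) -> int:
        """Edit distance between two label strings (dynamic programming)."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            curr = [i]
            for j, cb in enumerate(b, 1):
                curr.append(min(prev[j] + 1,                # deletion
                                curr[j - 1] + 1,            # insertion
                                prev[j - 1] + (ca != cb)))  # substitution
            prev = curr
        return prev[-1]

    def hamming(m1, m2) -> int:
        """Distance between two meanings, encoded as equal-length feature tuples."""
        return sum(f1 != f2 for f1, f2 in zip(m1, m2))

    def topographic_similarity(meanings, labels) -> float:
        """Spearman correlation between all pairwise meaning distances and the
        corresponding label distances. Values near 1 indicate a highly
        structured (compositional) language; values near 0 a holistic one."""
        pairs = list(combinations(range(len(meanings)), 2))
        meaning_d = [hamming(meanings[i], meanings[j]) for i, j in pairs]
        label_d = [levenshtein(labels[i], labels[j]) for i, j in pairs]
        rho, _ = spearmanr(meaning_d, label_d)
        return rho

    # Toy 2x2 meaning space: color x shape.
    meanings = [("red", "circle"), ("red", "square"),
                ("blue", "circle"), ("blue", "square")]
    structured = ["re-ci", "re-sq", "bl-ci", "bl-sq"]  # each feature -> fixed substring
    holistic = ["wug", "dap", "fep", "kiki"]           # arbitrary, unanalyzable labels

    print(topographic_similarity(meanings, structured))  # 1.0
    print(topographic_similarity(meanings, holistic))    # much lower, near 0

In a fully compositional language, each meaning feature maps onto a fixed substring, so label distances track meaning distances and the correlation approaches 1; the experiments described above vary exactly this property in the languages given to human and network learners.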

Related research

03/30/2019 · Linguistic generalization and compositionality in modern artificial neural networks
In the last decade, deep artificial neural networks have achieved astoun...

04/22/2022 · Emergent Communication for Understanding Human Language Evolution: What's Missing?
Emergent communication protocols among humans and artificial neural netw...

04/20/2020 · Compositionality and Generalization in Emergent Languages
Natural language allows us to refer to novel composite concepts by combi...

09/11/2018 · What can linguistics and deep learning contribute to each other?
Joe Pater's target article calls for greater interaction between neural ...

10/28/2022 · Modeling structure-building in the brain with CCG parsing and large language models
To model behavioral and neural correlates of language comprehension in n...

02/20/2022 · Understanding Robust Generalization in Learning Regular Languages
A key feature of human intelligence is the ability to generalize beyond ...

05/29/2019 · Word-order biases in deep-agent emergent communication
Sequence-processing neural networks led to remarkable progress on many N...
