Compositionality and Generalization in Emergent Languages

04/20/2020
by Rahma Chaabouni, et al.

Natural language allows us to refer to novel composite concepts by combining expressions denoting their parts according to systematic rules, a property known as compositionality. In this paper, we study whether the language emerging in deep multi-agent simulations possesses a similar ability to refer to novel primitive combinations, and whether it accomplishes this feat by strategies akin to human-language compositionality. Equipped with new ways to measure compositionality in emergent languages inspired by disentanglement in representation learning, we establish three main results. First, given sufficiently large input spaces, the emergent language will naturally develop the ability to refer to novel composite concepts. Second, there is no correlation between the degree of compositionality of an emergent language and its ability to generalize. Third, while compositionality is not necessary for generalization, it provides an advantage in terms of language transmission: The more compositional a language is, the more easily it will be picked up by new learners, even when the latter differ in architecture from the original agents. We conclude that compositionality does not arise from simple generalization pressure, but if an emergent language does chance upon it, it will be more likely to survive and thrive.
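To make the notion of "measuring compositionality" concrete, here is a minimal sketch of topographic similarity, a standard metric in the emergent-language literature that the paper's disentanglement-inspired measures are compared against. It is the Spearman rank correlation between pairwise distances in input space and pairwise distances in message space: in a compositional language, similar inputs should receive similar messages. The helper names and the toy language below are illustrative, not taken from the paper.

```python
from itertools import combinations

def hamming(a, b):
    # Number of positions at which two equal-length tuples differ.
    return sum(x != y for x, y in zip(a, b))

def ranks(values):
    # Average ranks (ties share the mean of their rank positions).
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # 1-based average rank of the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Pearson correlation computed on the ranks of x and y.
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

def topographic_similarity(inputs, messages):
    # Correlate pairwise input distances with pairwise message distances.
    pairs = list(combinations(range(len(inputs)), 2))
    d_in = [hamming(inputs[i], inputs[j]) for i, j in pairs]
    d_msg = [hamming(messages[i], messages[j]) for i, j in pairs]
    return spearman(d_in, d_msg)

# Toy perfectly compositional language: each attribute value maps to one symbol.
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
messages = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y")]
print(topographic_similarity(inputs, messages))  # 1.0 for this mapping
```

Because the toy messages mirror the input attributes symbol-for-symbol, the two distance lists are identical and the correlation is maximal; a holistic (non-compositional) language would score near zero.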
