
Compositionality and Generalization in Emergent Languages

by Rahma Chaabouni, et al.

Natural language allows us to refer to novel composite concepts by combining expressions denoting their parts according to systematic rules, a property known as compositionality. In this paper, we study whether the language emerging in deep multi-agent simulations possesses a similar ability to refer to novel primitive combinations, and whether it accomplishes this feat by strategies akin to human-language compositionality. Equipped with new ways to measure compositionality in emergent languages inspired by disentanglement in representation learning, we establish three main results. First, given sufficiently large input spaces, the emergent language will naturally develop the ability to refer to novel composite concepts. Second, there is no correlation between the degree of compositionality of an emergent language and its ability to generalize. Third, while compositionality is not necessary for generalization, it provides an advantage in terms of language transmission: The more compositional a language is, the more easily it will be picked up by new learners, even when the latter differ in architecture from the original agents. We conclude that compositionality does not arise from simple generalization pressure, but if an emergent language does chance upon it, it will be more likely to survive and thrive.
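To make the notion of "measuring compositionality" concrete, below is a minimal sketch of topographic similarity, a standard compositionality measure from the emergent-language literature, often reported alongside disentanglement-inspired metrics like those the paper proposes. It correlates pairwise distances between meanings (attribute vectors) with pairwise distances between the messages naming them: in a compositional language, similar meanings get similar messages. The helper names and the toy language are illustrative, not from the paper.

```python
from itertools import combinations

def hamming(a, b):
    # Number of positions where two equal-length sequences differ.
    return sum(x != y for x, y in zip(a, b))

def pearson(xs, ys):
    # Plain Pearson correlation, stdlib only.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def topographic_similarity(meanings, messages):
    # Correlation between distances in meaning space and in message space.
    pairs = list(combinations(range(len(meanings)), 2))
    d_meaning = [hamming(meanings[i], meanings[j]) for i, j in pairs]
    d_message = [hamming(messages[i], messages[j]) for i, j in pairs]
    return pearson(d_meaning, d_message)

# Toy perfectly compositional language: one symbol per attribute value.
meanings = [(0, 0), (0, 1), (1, 0), (1, 1)]
messages = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y")]
print(topographic_similarity(meanings, messages))  # 1.0 for this mapping
```

A holistic language (arbitrary meaning-to-message mapping) scores near zero on this measure, while the paper's findings imply that a high score is neither required for, nor predictive of, generalization to novel attribute combinations.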

