Information Minimization In Emergent Languages

05/31/2019
by Eugene Kharitonov, et al.

There is a growing interest in studying the languages emerging when neural agents are jointly trained to solve tasks that require communication through discrete messages. We investigate here the information-theoretic complexity of such languages, focusing on the most basic two-agent, one-symbol, one-exchange setup. We find that, under common training procedures, the emergent languages are subject to an information minimization pressure: The mutual information between the communicating agent's inputs and the messages is close to the minimum that still allows the task to be solved. After verifying this information minimization property, we perform experiments showing that a stronger discrete-channel-driven information minimization pressure leads to increased robustness to overfitting and to adversarial attacks. We conclude by discussing the implications of our findings for the studies of artificial and natural language emergence, and for representation learning.
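The central quantity in the abstract is the mutual information between the sender's inputs and its discrete messages. As an illustration of what is being measured (not the paper's actual code), here is a minimal plug-in estimate of I(X; M) from empirical co-occurrence counts; the `mutual_information` helper and the toy senders are assumptions for this sketch.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(X; M) in bits from (input, message) samples.

    Illustrative sketch: sums p(x, m) * log2(p(x, m) / (p(x) * p(m)))
    over the empirical joint distribution of the samples.
    """
    n = len(pairs)
    joint = Counter(pairs)                 # counts of (input, message) pairs
    px = Counter(x for x, _ in pairs)      # marginal counts of inputs
    pm = Counter(m for _, m in pairs)      # marginal counts of messages
    mi = 0.0
    for (x, m), c in joint.items():
        p_xm = c / n
        # p(x, m) / (p(x) * p(m)) written with counts to avoid extra divisions
        mi += p_xm * math.log2(p_xm * n * n / (px[x] * pm[m]))
    return mi

# A sender that emits the same message for every input conveys zero
# information; a sender that names each of 8 inputs uniquely conveys 3 bits.
constant = [(x, 0) for x in range(8)]
identity = [(x, x) for x in range(8)]
print(mutual_information(constant))  # 0.0
print(mutual_information(identity))  # 3.0
```

Under the information minimization pressure described above, a trained sender would sit near the low end of this range: just enough bits for the receiver to solve the task.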


