Emergence of Machine Language: Towards Symbolic Intelligence with Neural Networks

01/14/2022
by Yuqi Wang, et al.

Representation is a core issue in artificial intelligence. Humans use discrete language to communicate and learn from each other, while machines use continuous features (vectors, matrices, or tensors in deep neural networks) to represent cognitive patterns. Discrete symbols are low-dimensional and decoupled and support strong reasoning, while continuous features are high-dimensional and coupled and offer remarkable abstraction capability. In recent years, deep learning has pushed the idea of continuous representation to the extreme, using millions of parameters to achieve high accuracy. Although this is reasonable from a statistical perspective, it suffers from major problems such as a lack of interpretability, poor generalization, and vulnerability to adversarial attacks. Since both paradigms have strengths and weaknesses, a better choice is to seek a reconciliation. In this paper, we make an initial attempt in this direction. Specifically, we propose to combine the principles of symbolism and connectionism by using neural networks to derive a discrete representation. This process closely parallels human language, which is a natural combination of discrete symbols and neural systems: the brain processes continuous signals and expresses intelligence through discrete language. To mimic this functionality, we call our approach machine language. By designing an interactive environment and task, we demonstrate that machines can spontaneously generate a flexible, semantic language through cooperation. Moreover, our experiments show that discrete language representation has several advantages over continuous feature representation in terms of interpretability, generalization, and robustness.
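One common way to let a neural network emit discrete symbols from continuous features, as the abstract describes, is vector quantization: the encoder's continuous output is snapped to its nearest entry in a learned codebook, and the entry's index serves as the discrete token. The sketch below illustrates this idea only; the paper's actual mechanism is not specified in the abstract, and all names (`codebook`, `feature`) are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a codebook of 16 discrete symbols, each with an
# 8-dimensional embedding, and one continuous feature vector that an
# encoder network might produce.
codebook = rng.normal(size=(16, 8))
feature = rng.normal(size=(8,))

# Quantize: the nearest codebook entry becomes the emitted symbol.
dists = np.linalg.norm(codebook - feature, axis=1)
symbol = int(np.argmin(dists))   # a discrete, low-dimensional token

# Forward pass uses the quantized vector; in a real framework a
# straight-through estimator would pass gradients back to `feature`.
quantized = codebook[symbol]
```

In a full communication setup, `symbol` is what one agent transmits and the receiving agent embeds, so only the discrete token crosses the channel while both networks remain trainable.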


