Discrete and continuous representations and processing in deep learning: Looking forward

01/04/2022
by Ruben Cartuyvels, et al.

Discrete and continuous representations of content (e.g., of language or images) have interesting properties to be explored for machine understanding of and reasoning with this content. This position paper puts forward our opinion on the role of discrete and continuous representations and their processing in the deep learning field. Current neural network models operate on continuous-valued data, compressing information into dense, distributed embeddings. In stark contrast, humans use discrete symbols in their communication with language. Such symbols represent a compressed version of the world that derives its meaning from shared contextual information. Additionally, human reasoning involves symbol manipulation at a cognitive level, which facilitates abstract reasoning, the composition of knowledge and understanding, generalization and efficient learning. Motivated by these insights, in this paper we argue that combining discrete and continuous representations and their processing will be essential to build systems that exhibit a general form of intelligence. We suggest and discuss several avenues that could improve current neural networks with the inclusion of discrete elements to combine the advantages of both types of representations.
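
To make the contrast concrete, the sketch below illustrates the two views side by side: a discrete symbol (a vocabulary index) mapped to a dense, continuous embedding, and a continuous vector compressed back into a discrete symbol by nearest-neighbour quantisation against a small codebook. This is a minimal illustration only; the codebook, dimensions and quantisation rule are assumptions for the example, not the method proposed in the paper.

```python
# Illustrative sketch (assumed setup, not the paper's method): a discrete
# symbol becomes a dense vector, and a dense vector is snapped back to a
# discrete code, VQ-style. All sizes and names here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embed_dim, codebook_size = 10_000, 256, 512

# Continuous view: a word (discrete index) is looked up as a dense,
# distributed embedding vector.
embedding_table = rng.normal(size=(vocab_size, embed_dim))
word_id = 42                            # a discrete symbol
dense_vec = embedding_table[word_id]    # its continuous representation

# Discrete view: compress a continuous vector to a single symbol by
# choosing the nearest entry of a (here random) codebook.
codebook = rng.normal(size=(codebook_size, embed_dim))
distances = np.linalg.norm(codebook - dense_vec, axis=1)
code_index = int(np.argmin(distances))  # the discrete "symbol"
quantised = codebook[code_index]        # its continuous stand-in

print(f"symbol {word_id} -> dense vector of dimension {dense_vec.shape[0]}")
print(f"dense vector -> discrete code {code_index} (one of {codebook_size})")
```

The point of the sketch is only the round trip between the two kinds of representation: the discrete side is compact and composable, while the continuous side is what current networks actually compute with.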

Related research

01/14/2022
Emergence of Machine Language: Towards Symbolic Intelligence with Neural Networks
Representation is a core issue in artificial intelligence. Humans use di...

08/24/2022
Deep Symbolic Learning: Discovering Symbols and Rules from Perceptions
Neuro-Symbolic (NeSy) integration combines symbolic reasoning with Neura...

04/01/2021
Reconciling the Discrete-Continuous Divide: Towards a Mathematical Theory of Sparse Communication
Neural networks and other machine learning models compute continuous rep...

07/06/2021
Discrete-Valued Neural Communication
Deep learning has advanced from fully connected architectures to structu...

02/02/2017
Symbolic, Distributed and Distributional Representations for Natural Language Processing in the Era of Deep Learning: a Survey
Natural language and symbols are intimately correlated. Recent advances ...

05/18/2019
Human-like machine thinking: Language guided imagination
Human thinking requires the brain to understand the meaning of language ...

02/15/2023
Topological Neural Discrete Representation Learning à la Kohonen
Unsupervised learning of discrete representations from continuous ones i...
