Towards a neural architecture of language: Deep learning versus logistics of access in neural architectures for compositional processing

10/19/2022
by Frank van der Velde, et al.

Recently, a number of articles have argued that deep learning models such as GPT could also capture key aspects of language processing in the human mind and brain. However, I will argue that these models are not suitable as neural models of human language. First, they fail on fundamental boundary conditions, such as the amount of learning they require, which implies that the mechanisms of GPT and of brain language processing are fundamentally different. Second, they do not possess the logistics of access needed for compositional and productive human language processing. Neural architectures could possess such logistics of access based on small-world-like network structures, in which processing consists not of symbol manipulation but of controlling the flow of activation. On this view, two complementary approaches are needed to investigate the relation between brain and cognition. Investigating learning methods could reveal how the 'learned cognition' found in deep learning could develop in the brain. However, neural architectures with logistics of access should also be developed to account for the 'productive cognition' required for natural or artificial human language processing. Eventually, these approaches could perhaps be combined to see how such architectures could develop, through learning and development, from a simpler basis.


