On the Turing Completeness of Modern Neural Network Architectures

01/10/2019
by Jorge Pérez, et al.

Alternatives to recurrent neural networks, in particular, architectures based on attention or convolutions, have been gaining momentum for processing input sequences. In spite of their relevance, the computational properties of these alternatives have not yet been fully explored. We study the computational power of two of the most paradigmatic architectures exemplifying these mechanisms: the Transformer (Vaswani et al., 2017) and the Neural GPU (Kaiser & Sutskever, 2016). We show both models to be Turing complete exclusively based on their capacity to compute and access internal dense representations of the data. In particular, neither the Transformer nor the Neural GPU requires access to an external memory to become Turing complete. Our study also reveals some minimal sets of elements needed to obtain these completeness results.
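For readers unfamiliar with the mechanism the Transformer result builds on, the sketch below shows standard scaled dot-product attention as defined by Vaswani et al. (2017), implemented in plain NumPy. It is an illustration of the underlying operation only: the completeness construction in the paper relies on a hard (argmax-style) variant of attention and on specific positional encodings, not on this exact softmax formulation, and the function and variable names here are chosen for exposition rather than taken from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention (Vaswani et al., 2017).

    Q: (n_q, d_k) queries, K: (n_k, d_k) keys, V: (n_k, d_v) values.
    Returns an (n_q, d_v) matrix of attended values.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)         # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # weighted sum of values

# Toy example: 3 query positions attending over a 4-token sequence, d_k = d_v = 2.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 2))
K = rng.standard_normal((4, 2))
V = rng.standard_normal((4, 2))
print(scaled_dot_product_attention(Q, K, V).shape)       # -> (3, 2)
```

The point of the paper is that this kind of internal, dense re-representation of the input, iterated across layers and decoding steps, is by itself enough to simulate a Turing machine; no external memory module is attached.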


