On the Computational Power of Transformers and Its Implications in Sequence Modeling

06/16/2020
by Satwik Bhattamishra, et al.

Transformers are being used extensively across several sequence modeling tasks. Significant research effort has been devoted to experimentally probing the inner workings of Transformers. However, our conceptual and theoretical understanding of their power and inherent limitations is still nascent. In particular, the roles of the various components of Transformers, such as positional encodings, attention heads, residual connections, and feedforward networks, are not clear. In this paper, we take a step towards answering these questions. We analyze their computational power as captured by Turing-completeness. We first provide an alternate proof to show that vanilla Transformers are Turing-complete, and then we prove that Transformers with positional masking and without any positional encoding are also Turing-complete. We further analyze the necessity of each component for the Turing-completeness of the network; interestingly, we find that a particular type of residual connection is necessary. We demonstrate the practical implications of our results via experiments on machine translation and synthetic tasks.
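To make the distinction between positional encodings and positional masking concrete, below is a minimal NumPy sketch, not the paper's formal construction, of a single self-attention block in which order information enters only through a causal positional mask and no positional encoding is added to the inputs; the function name and weight matrices (masked_self_attention, Wq, Wk, Wv, Wo) are illustrative placeholders, and the residual connection around the attention output is the kind of shortcut whose necessity the paper analyzes.

```python
# Illustrative sketch (assumed names and shapes): self-attention with a
# positional (causal) mask instead of positional encodings.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def masked_self_attention(X, Wq, Wk, Wv, Wo):
    """X: (seq_len, d_model) token embeddings with NO positional encoding."""
    n, _ = X.shape
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Positional masking: position i may attend only to positions <= i,
    # so sequence order is conveyed solely by this mask.
    mask = np.triu(np.ones((n, n), dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)
    A = softmax(scores, axis=-1)
    attn_out = A @ V @ Wo
    # Residual (shortcut) connection around the attention block.
    return X + attn_out

rng = np.random.default_rng(0)
d_model, d_head, seq_len = 8, 8, 5
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) * 0.1 for _ in range(3))
Wo = rng.normal(size=(d_head, d_model)) * 0.1
print(masked_self_attention(X, Wq, Wk, Wv, Wo).shape)  # (5, 8)
```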


Related research

08/12/2015 - Syntax Evolution: Problems and Recursion
Why did only we humans evolve Turing completeness? Turing completeness i...

04/20/2023 - Computational Power of Particle Methods
The computational power of a compute model determines the class of probl...

01/10/2019 - On the Turing Completeness of Modern Neural Network Architectures
Alternatives to recurrent neural networks, in particular, architectures ...

04/29/2021 - Analyzing the Nuances of Transformers' Polynomial Simplification Abilities
Symbolic Mathematical tasks such as integration often require multiple w...

05/26/2023 - On the Computational Power of Decoder-Only Transformer Language Models
This article presents a theoretical evaluation of the computational univ...

02/20/2023 - Deep Transformers without Shortcuts: Modifying Self-attention for Faithful Signal Propagation
Skip connections and normalisation layers form two standard architectura...

02/04/2021 - Formalising a Turing-Complete Choreographic Language in Coq
Theory of choreographic languages typically includes a number of complex...
