On the Computational Power of RNNs

06/14/2019
by Samuel A. Korsky, et al.

Recent neural network architectures such as the basic recurrent neural network (RNN) and the gated recurrent unit (GRU) have gained prominence as end-to-end learning architectures for natural language processing tasks. But what is the computational power of such systems? We prove that finite-precision RNNs with one hidden layer and ReLU activation, as well as finite-precision GRUs, are exactly as computationally powerful as deterministic finite automata. Allowing arbitrary precision, we prove that RNNs with one hidden layer and ReLU activation are at least as computationally powerful as pushdown automata. If we also allow infinite precision, infinite edge weights, and nonlinear output activation functions, we prove that GRUs are at least as computationally powerful as pushdown automata. All results are shown constructively.
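
The DFA direction of these results has an intuitive flavor. The sketch below is a minimal illustration of one standard construction (not necessarily the paper's exact one): a single-layer ReLU RNN whose hidden units index (state, symbol) pairs, whose recurrent weights encode the transition function, and whose readout checks whether the final state is accepting. The example automaton, which accepts binary strings containing an even number of 1s, is a hypothetical choice made purely for illustration.

```python
import numpy as np

# Hypothetical example: a single-layer ReLU RNN simulating a DFA.
# DFA: accepts binary strings containing an even number of 1s.
states = ["even", "odd"]            # "even" is the start (and accepting) state
symbols = ["0", "1"]
delta = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd",  "0"): "odd",  ("odd",  "1"): "even"}
accepting = {"even"}

# One hidden unit per (state, symbol) pair; unit (q, a) fires iff the
# DFA was in state q just before reading the current symbol a.
pairs = [(q, a) for q in states for a in symbols]
idx = {p: i for i, p in enumerate(pairs)}
n, m = len(pairs), len(symbols)

# Recurrent weights: unit (q, a) receives +1 from every unit (q2, a2)
# whose transition lands in q, i.e. delta[(q2, a2)] == q.
W = np.zeros((n, n))
for (q, a), i in idx.items():
    for (q2, a2), j in idx.items():
        if delta[(q2, a2)] == q:
            W[i, j] = 1.0

# Input weights: unit (q, a) receives +1 when the current symbol is a.
U = np.zeros((n, m))
for (q, a), i in idx.items():
    U[i, symbols.index(a)] = 1.0

b = -1.0 * np.ones(n)  # ReLU(state_match + symbol_match - 1) acts as an AND gate

def run(string):
    # Initial hidden state: any pair whose transition leads to the start
    # state (here ("even", "0"), since delta[("even", "0")] == "even").
    h = np.zeros(n)
    h[idx[("even", "0")]] = 1.0
    for ch in string:
        x = np.zeros(m)
        x[symbols.index(ch)] = 1.0
        h = np.maximum(0.0, W @ h + U @ x + b)   # ReLU recurrence
    # Readout: accept iff the active pair's transition lands in an accepting state.
    score = sum(h[idx[(q, a)]] for (q, a) in pairs if delta[(q, a)] in accepting)
    return score >= 1.0

assert run("1010")       # two 1s  -> even -> accept
assert not run("10")     # one 1   -> odd  -> reject
```

The hidden dimension is |Q| x |Sigma| so that a single ReLU can act as an AND over "previous state" and "current symbol"; with a finite state set and alphabet, all weights and activations stay in {0, 1}, consistent with the finite-precision setting.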

Related research

05/13/2018 - On the Practical Computational Power of Finite Precision RNNs for Language Recognition
While Recurrent Neural Networks (RNNs) are famously known to be Turing c...

09/12/2017 - Shifting Mean Activation Towards Zero with Bipolar Activation Functions
We propose a simple extension to the ReLU-family of activation functions...

12/22/2022 - Training Integer-Only Deep Recurrent Neural Networks
Recurrent neural networks (RNN) are the backbone of many text and speech...

06/27/2022 - Extracting Weighted Finite Automata from Recurrent Neural Networks for Natural Languages
Recurrent Neural Networks (RNNs) have achieved tremendous success in seq...

04/01/2020 - Distance and Equivalence between Finite State Machines and Recurrent Neural Networks: Computational results
The need of interpreting Deep Learning (DL) models has led, during the p...

07/04/2018 - Connecting Weighted Automata and Recurrent Neural Networks through Spectral Learning
In this paper, we unravel a fundamental connection between weighted fini...

06/29/2023 - On the Relationship Between RNN Hidden State Vectors and Semantic Ground Truth
We examine the assumption that the hidden-state vectors of recurrent neu...
