A Formal Hierarchy of RNN Architectures

04/18/2020
by William Merrill, et al.

We develop a formal hierarchy of the expressive capacity of RNN architectures. The hierarchy is based on two formal properties: space complexity, which measures the RNN's memory, and rational recurrence, defined as whether the recurrent update can be described by a weighted finite-state machine. We place several RNN variants within this hierarchy. For example, we prove the LSTM is not rational, which formally separates it from the related QRNN (Bradbury et al., 2016). We also show how these models' expressive capacity is expanded by stacking multiple layers or composing them with different pooling functions. Our results build on the theory of "saturated" RNNs (Merrill, 2019). While formally extending these findings to unsaturated RNNs is left to future work, we hypothesize that the practical learnable capacity of unsaturated RNNs obeys a similar hierarchy. Experimental findings from training unsaturated networks on formal languages support this conjecture.
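To make the notion of rational recurrence concrete, here is a minimal hand-written sketch (not code from the paper) of a weighted finite-state machine. A recurrence is rational in this sense when each coordinate of the recurrent state can be computed by such a machine: the state is a weight vector updated by per-symbol matrix multiplication. This toy 2-state WFA accumulates the number of `a` symbols seen so far; the names `TRANSITIONS`, `INIT`, `FINAL`, and `wfa_score` are illustrative, not from the paper.

```python
# Hypothetical sketch of a weighted finite-state machine (WFA).
# A rational recurrence is one whose state update can be expressed,
# per coordinate, as this kind of matrix-multiply recurrence.

TRANSITIONS = {
    # One 2x2 transition matrix per input symbol.
    'a': ((1, 1), (0, 1)),  # reading 'a' adds 1 to the accumulator
    'b': ((1, 0), (0, 1)),  # reading 'b' leaves the state unchanged
}
INIT = (1, 0)   # initial state weights
FINAL = (0, 1)  # final (output) weights

def wfa_score(string):
    """Weight the WFA assigns to `string` (here: the number of 'a's)."""
    v = INIT
    for ch in string:
        m = TRANSITIONS[ch]
        # Recurrent update: v <- v @ m (row vector times matrix).
        v = (v[0] * m[0][0] + v[1] * m[1][0],
             v[0] * m[0][1] + v[1] * m[1][1])
    # Output: dot product of the state with the final-weight vector.
    return v[0] * FINAL[0] + v[1] * FINAL[1]
```

For example, `wfa_score("abab")` returns 2. The paper's separation result says the QRNN's recurrent update admits a description of this form, while the (saturated) LSTM's provably does not.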
