On the Linguistic Capacity of Real-Time Counter Automata

04/15/2020
by William Merrill, et al.

Counter machines have achieved a newfound relevance to the field of natural language processing (NLP): recent work suggests that some strong-performing recurrent neural networks utilize their memory as counters. Thus, one potential way to understand the success of these networks is to revisit the theory of counter computation. We therefore study the abilities of real-time counter machines as formal grammars, focusing on formal properties that are relevant for NLP models. We first show that several variants of the counter machine converge to express the same class of formal languages. We also prove that counter languages are closed under complement, union, intersection, and many other common set operations. Next, we show that counter machines cannot evaluate Boolean expressions, even though they can weakly validate their syntax. This has implications for the interpretability and evaluation of neural network systems: successfully matching syntactic patterns does not guarantee that counter memory accurately encodes compositional semantics. Finally, we consider whether counter languages are semilinear. This work makes general contributions to the theory of formal languages that are of potential interest for understanding recurrent neural networks.
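Since the abstract's claims all turn on what a real-time counter machine can and cannot compute, a small illustration may help. Below is a minimal Python sketch, under the standard definition of a real-time, single-counter automaton (one left-to-right pass over the input, a counter supporting increment, decrement, and zero-testing), that recognizes the canonical counter language { a^n b^n : n >= 0 }. The state names and the `accepts` helper are illustrative assumptions, not constructs from the paper, which studies more general multi-counter variants.

```python
def accepts(string: str) -> bool:
    """Simulate a real-time, one-counter automaton for { a^n b^n : n >= 0 }."""
    state = "reading_a"   # phase 1: count a's; phase 2: match b's
    counter = 0
    for symbol in string:  # real-time: exactly one step per input symbol
        if state == "reading_a" and symbol == "a":
            counter += 1                  # one increment per 'a'
        elif state in ("reading_a", "reading_b") and symbol == "b":
            state = "reading_b"
            if counter == 0:              # zero-test: more b's than a's
                return False
            counter -= 1                  # one decrement per 'b'
        else:
            return False                  # an 'a' after a 'b', or a bad symbol
    # accept iff every 'a' was matched, i.e. the counter returned to zero
    return counter == 0

assert accepts("aabb") and accepts("")
assert not accepts("aab") and not accepts("ba")
```

In the same spirit, a single counter can check that the parentheses of a Boolean expression balance, which is the kind of weak syntactic validation the abstract mentions; the paper's negative result is that no real-time counter machine can also compute the expression's truth value, since that requires more than tracking occurrence counts.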

Related research

06/04/2019 · Sequential Neural Networks as Automata
This work attempts to explain the types of computation that neural netwo...

02/19/2021 · Formal Language Theory Meets Modern NLP
NLP is deeply intertwined with the formal study of language, both concep...

10/07/2019 · The Well Structured Problem for Presburger Counter Machines
We introduce the well structured problem as the question of whether a mo...

06/15/2020 · Some complete ω-powers of a one-counter language, for any Borel class of finite rank
We prove that, for any natural number n ≥ 1, we can find a finite alphab...

09/23/2020 · On the Ability of Self-Attention Networks to Recognize Counter Languages
Transformers have supplanted recurrent models in a large number of NLP t...

04/10/2018 · Counter Machines and Distributed Automata: A Story about Exchanging Space and Time
We prove the equivalence of two classes of counter machines and one clas...

09/11/2018 · Limitations in learning an interpreted language with recurrent models
In this submission I report work in progress on learning simplified inte...
