State-Regularized Recurrent Neural Networks to Extract Automata and Explain Predictions

12/10/2022
by Cheng Wang, et al.

Recurrent neural networks are a widely used class of neural architectures. They have, however, two shortcomings. First, they are often treated as black-box models, and as such it is difficult to understand what exactly they learn and how they arrive at a particular prediction. Second, they tend to work poorly on sequences that require long-term memorization, despite having this capacity in principle. We aim to address both shortcomings with a class of recurrent networks that use a stochastic state transition mechanism between cell applications. This mechanism, which we term state-regularization, makes RNNs transition between a finite set of learnable states. We evaluate state-regularized RNNs on (1) regular languages for the purpose of automata extraction; (2) non-regular languages such as balanced parentheses and palindromes, where external memory is required; and (3) real-world sequence learning tasks for sentiment analysis, visual object recognition, and text categorization. We show that state-regularization (a) simplifies the extraction of finite state automata that display an RNN's state transition dynamics; (b) forces RNNs to operate more like automata with external memory and less like finite state machines, which potentially leads to a more structured memory; and (c) leads to better interpretability and explainability of RNNs by leveraging the probabilistic finite state transition mechanism over time steps.
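The sketch below illustrates the state-regularization idea from the abstract: after an ordinary recurrent update, the hidden state is replaced by a convex combination of a small set of learnable centroid states, with the softmax weights acting as a soft, automaton-like state-transition distribution. It uses the deterministic soft-attention approximation of the stochastic transition; the class name, the GRU backbone, the dot-product similarity, and the temperature parameter are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StateRegularizedGRUCell(nn.Module):
    """Minimal, hypothetical sketch of a state-regularized recurrent cell.

    After the GRU update, the hidden state is mapped onto a weighted
    mixture of k learnable centroid states ("finite set of learnable
    states"); the mixture weights expose an FSA-like transition
    distribution over those states at every time step.
    """

    def __init__(self, input_size, hidden_size, num_states=10, temperature=1.0):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        # k learnable centroid states
        self.centroids = nn.Parameter(torch.randn(num_states, hidden_size))
        self.temperature = temperature

    def forward(self, x, h):
        u = self.cell(x, h)                                   # ordinary recurrent update
        scores = u @ self.centroids.t()                       # similarity to each centroid
        alpha = F.softmax(scores / self.temperature, dim=-1)  # state-transition distribution
        h_new = alpha @ self.centroids                        # soft transition onto the centroids
        return h_new, alpha


if __name__ == "__main__":
    # Run a toy sequence and collect the per-step state distributions.
    cell = StateRegularizedGRUCell(input_size=8, hidden_size=16, num_states=5)
    h = torch.zeros(2, 16)
    xs = torch.randn(4, 2, 8)                                 # (time, batch, features)
    for x_t in xs:
        h, alpha = cell(x_t, h)
        print(alpha.argmax(dim=-1))                           # most likely discrete state per step
```

Discretizing the per-step distributions (for example, taking the argmax state at each step) is what makes reading a finite state automaton off the trained network straightforward, which is the extraction use case the abstract describes.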


Related research

01/25/2019
State-Regularized Recurrent Neural Networks
Recurrent neural networks are a widely used class of neural architecture...

01/28/2022
Extracting Finite Automata from RNNs Using State Merging
One way to interpret the behavior of a blackbox recurrent neural network...

06/27/2022
Extracting Weighted Finite Automata from Recurrent Neural Networks for Natural Languages
Recurrent Neural Networks (RNNs) have achieved tremendous success in seq...

10/17/2019
Probabilistic Deterministic Finite Automata and Recurrent Networks, Revisited
Reservoir computers (RCs) and recurrent neural networks (RNNs) can mimic...

09/07/2019
The Neural State Pushdown Automata
In order to learn complex grammars, recurrent neural networks (RNNs) req...

09/10/2020
On Computability, Learnability and Extractability of Finite State Machines from Recurrent Neural Networks
This work aims at shedding some light on connections between finite stat...

10/30/2019
Learning Deterministic Weighted Automata with Queries and Counterexamples
We present an algorithm for extraction of a probabilistic deterministic ...
