State-Regularized Recurrent Neural Networks

01/25/2019
by Cheng Wang, et al.

Recurrent neural networks are a widely used class of neural architectures. They have, however, two shortcomings. First, it is difficult to understand what exactly they learn. Second, they tend to work poorly on sequences requiring long-term memorization, despite having this capacity in principle. We aim to address both shortcomings with a class of recurrent networks that use a stochastic state transition mechanism between cell applications. This mechanism, which we term state-regularization, makes RNNs transition between a finite set of learnable states. We evaluate state-regularized RNNs on (1) regular languages for the purpose of automata extraction; (2) nonregular languages such as balanced parentheses, palindromes, and the copy task, where external memory is required; and (3) real-world sequence learning tasks for sentiment analysis, visual object recognition, and language modeling. We show that state-regularization (a) simplifies the extraction of finite state automata modeling an RNN's state transition dynamics; (b) forces RNNs to operate more like automata with external memory and less like finite state machines; and (c) improves the interpretability and explainability of RNNs.
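The core idea can be illustrated with a minimal NumPy sketch: an ordinary recurrent cell computes a candidate hidden state, which is then mapped onto a probability-weighted mixture of a small set of learnable centroid states. This is only an illustrative reading of the abstract's "stochastic state transition mechanism between cell applications", with a soft (deterministic-expectation) transition; the class name `StateRegularizedCell` and all parameter names are hypothetical, not taken from the paper.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

class StateRegularizedCell:
    """Hypothetical sketch of a state-regularized recurrent cell:
    a plain tanh RNN update followed by a soft transition onto a
    finite set of learnable centroid states."""

    def __init__(self, input_size, hidden_size, num_states, seed=0):
        rng = np.random.default_rng(seed)
        self.W_x = rng.normal(scale=0.1, size=(hidden_size, input_size))
        self.W_h = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
        self.b = np.zeros(hidden_size)
        # The finite set of learnable states the RNN may occupy.
        self.centroids = rng.normal(scale=0.1, size=(num_states, hidden_size))

    def step(self, x, h):
        # Ordinary recurrent candidate state.
        u = np.tanh(self.W_x @ x + self.W_h @ h + self.b)
        # Transition probabilities over the finite state set,
        # from similarity scores between the candidate and each centroid.
        alpha = softmax(self.centroids @ u)
        # New hidden state: probability-weighted mixture of centroids.
        return alpha @ self.centroids, alpha

# Usage: unroll over a toy sequence; alpha traces which discrete
# state the cell is in, which is what makes automata extraction easy.
cell = StateRegularizedCell(input_size=3, hidden_size=8, num_states=4)
h = np.zeros(8)
for x in np.random.default_rng(1).normal(size=(5, 3)):
    h, alpha = cell.step(x, h)
```

Because every hidden state is a mixture of a few centroids, hardening `alpha` to its argmax yields the discrete state sequence from which a finite state automaton can be read off.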


