Distance and Equivalence between Finite State Machines and Recurrent Neural Networks: Computational results

04/01/2020
by Reda Marzouk, et al.

The need to interpret Deep Learning (DL) models has led, in recent years, to a proliferation of works concerned with this issue. Among the strategies that aim to shed light on how information is represented internally in DL models, one consists in extracting symbolic rule-based machines from connectionist models, the extracted machines being expected to approximate the models' behaviour well. To better understand how reasonable these approximation strategies are, we need to know the computational complexity of measuring the quality of the approximation. In this article, we prove several computational results related to the problem of extracting Finite State Machine (FSM) based models from trained RNN language models (RNN-LMs). More precisely, we show the following:

(a) For general weighted RNN-LMs with a single hidden layer and a ReLU activation:
- The equivalence problem between a PDFA/PFA/WFA and a weighted first-order RNN-LM is undecidable;
- As a corollary, the distance problem between the languages generated by a PDFA/PFA/WFA and that of a weighted RNN-LM is not recursive;
- The intersection problem between a DFA and the cut language of a weighted RNN-LM is undecidable;
- The equivalence of a PDFA/PFA/WFA and a weighted RNN-LM over a finite support is EXP-Hard.

(b) For consistent weighted RNN-LMs with any computable activation function:
- Approximating the Chebyshev distance is decidable;
- Approximating the Chebyshev distance over a finite support is NP-Hard.

Moreover, our reduction technique from 3-SAT makes this latter fact easily generalizable to other RNN architectures (e.g. LSTMs/RNNs) and to RNNs with finite precision.
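To make the objects in these statements concrete, here is a minimal, self-contained Python sketch (not the paper's construction) of a single-hidden-layer ReLU RNN-LM, a toy PDFA, and the Chebyshev (L-infinity) distance between their string distributions restricted to a finite support of bounded-length strings. All names (ALPHABET, rnn_lm_string_probability, etc.) and the parameterization are illustrative assumptions, not taken from the paper.

import itertools
import numpy as np

# Hypothetical toy alphabet and end-of-string symbol.
ALPHABET = ["a", "b"]
EOS = "$"
SYMBOLS = ALPHABET + [EOS]
SYM2IDX = {s: i for i, s in enumerate(SYMBOLS)}


def rnn_lm_string_probability(string, W_hh, W_xh, b_h, W_out, b_out, h0):
    """Probability of `string` under a single-hidden-layer ReLU RNN-LM.

    At each step, the next-symbol distribution (including EOS) is a softmax
    over the current hidden state; the hidden state is then updated with a
    ReLU recurrence on the consumed symbol.
    """
    h, prob = h0, 1.0
    for symbol in list(string) + [EOS]:
        logits = W_out @ h + b_out
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        prob *= probs[SYM2IDX[symbol]]
        if symbol != EOS:
            x = np.zeros(len(SYMBOLS))
            x[SYM2IDX[symbol]] = 1.0
            h = np.maximum(0.0, W_hh @ h + W_xh @ x + b_h)  # ReLU activation
    return prob


def pdfa_string_probability(string, transitions, init_state, stop_prob):
    """Probability of `string` under a toy PDFA.

    `transitions[state][symbol] = (next_state, prob)`; the PDFA stops at a
    state with probability `stop_prob[state]`.
    """
    state, prob = init_state, 1.0
    for symbol in string:
        next_state, p = transitions[state][symbol]
        prob *= p
        state = next_state
    return prob * stop_prob[state]


def chebyshev_distance_on_finite_support(max_len, rnn_params, pdfa):
    """L-infinity distance restricted to all strings of length <= max_len."""
    support = [
        "".join(w)
        for n in range(max_len + 1)
        for w in itertools.product(ALPHABET, repeat=n)
    ]
    return max(
        abs(
            rnn_lm_string_probability(w, *rnn_params)
            - pdfa_string_probability(w, *pdfa)
        )
        for w in support
    )


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 4  # hidden dimension of the illustrative RNN-LM
    rnn_params = (
        rng.normal(size=(d, d)),             # W_hh
        rng.normal(size=(d, len(SYMBOLS))),  # W_xh
        rng.normal(size=d),                  # b_h
        rng.normal(size=(len(SYMBOLS), d)),  # W_out
        rng.normal(size=len(SYMBOLS)),       # b_out
        np.zeros(d),                         # h0
    )
    # One-state PDFA: emits "a"/"b" with prob. 0.3 each, stops with prob. 0.4.
    pdfa = ({0: {"a": (0, 0.3), "b": (0, 0.3)}}, 0, {0: 0.4})
    print(chebyshev_distance_on_finite_support(3, rnn_params, pdfa))

Note that the finite-support restriction is what makes this brute-force computation possible at all: the support grows exponentially in max_len, which is consistent with the NP-Hardness result stated above, whereas the distances over the full (infinite) language are the ones shown to be non-recursive.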

