
Connecting Weighted Automata and Recurrent Neural Networks through Spectral Learning
In this paper, we unravel a fundamental connection between weighted finite automata (WFAs) and second-order recurrent neural networks (2-RNNs): in the case of sequences of discrete symbols, WFAs and 2-RNNs with linear activation functions are expressively equivalent. Motivated by this result, we build upon a recent extension of the spectral learning algorithm to vector-valued WFAs and propose the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors. This algorithm relies on estimating low-rank sub-blocks of the so-called Hankel tensor, from which the parameters of a linear 2-RNN can be provably recovered. The performance of the proposed method is assessed in a simulation study.
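To make the model class concrete, the following is a minimal sketch of the forward pass of a linear 2-RNN: the state update is bilinear in the previous state and the current input vector (a contraction of a third-order transition tensor), and the output is a linear readout of the final state. All dimensions, variable names, and parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 4, 3                          # state dimension, input dimension (illustrative)
alpha = rng.standard_normal(n)       # initial state vector
A = rng.standard_normal((n, d, n))   # third-order transition tensor
omega = rng.standard_normal(n)       # linear output (readout) vector

def linear_2rnn(xs):
    """Compute f(x_1, ..., x_T) for a sequence of continuous input vectors.

    With a linear activation, the update is the bilinear map
    h_t[j] = sum_{i,k} h_{t-1}[i] * A[i, k, j] * x_t[k].
    """
    h = alpha
    for x in xs:
        h = np.einsum('i,ikj,k->j', h, A, x)
    return h @ omega

seq = [rng.standard_normal(d) for _ in range(5)]
print(linear_2rnn(seq))
```

Because the activation is linear, the computed function is multilinear in the inputs; in particular, feeding one-hot input vectors recovers a WFA whose transition matrices are the slices `A[:, k, :]`, which is the expressive-equivalence observation the abstract refers to.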