Connecting Weighted Automata and Recurrent Neural Networks through Spectral Learning

07/04/2018
by   Guillaume Rabusseau, et al.

In this paper, we unravel a fundamental connection between weighted finite automata (WFAs) and second-order recurrent neural networks (2-RNNs): in the case of sequences of discrete symbols, WFAs and 2-RNNs with linear activation functions are expressively equivalent. Motivated by this result, we build upon a recent extension of the spectral learning algorithm to vector-valued WFAs and propose the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors. This algorithm relies on estimating low-rank sub-blocks of the so-called Hankel tensor, from which the parameters of a linear 2-RNN can be provably recovered. The performance of the proposed method is assessed in a simulation study.
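The expressive equivalence stated above can be illustrated concretely: a WFA with initial weights α, final weights ω, and transition matrices {A_σ} computes f(w) = αᵀ A_{w₁}⋯A_{wₖ} ω, and stacking the transition matrices into a third-order tensor yields a linear 2-RNN that computes the same function on one-hot encoded inputs. The following sketch (our own illustration, not code from the paper; all variable names are placeholders) checks this on a random WFA:

```python
import numpy as np

# A random WFA (alpha, {A_sigma}, omega) over a 2-symbol alphabet.
rng = np.random.default_rng(0)
n_states, n_symbols = 3, 2
alpha = rng.standard_normal(n_states)                      # initial weight vector
omega = rng.standard_normal(n_states)                      # final weight vector
A_sigma = rng.standard_normal((n_symbols, n_states, n_states))  # one matrix per symbol

def wfa(word):
    """f(w) = alpha^T A_{w1} ... A_{wk} omega for a word of symbol indices."""
    v = alpha
    for s in word:
        v = v @ A_sigma[s]
    return v @ omega

# Linear 2-RNN: stack the matrices into a tensor A with A[i, k, j] = (A_k)[i, j],
# so the hidden state evolves bilinearly in (h_{t-1}, x_t).
A = np.transpose(A_sigma, (1, 0, 2))                       # shape (state, symbol, state)

def linear_2rnn(inputs):
    """h_t[j] = sum_{i,k} h_{t-1}[i] * A[i, k, j] * x_t[k]; output is h_k . omega."""
    h = alpha
    for x in inputs:
        h = np.einsum('i,ikj,k->j', h, A, x)
    return h @ omega

word = [0, 1, 1, 0]
one_hot = np.eye(n_symbols)[word]                          # discrete symbols as one-hot vectors
print(np.allclose(wfa(word), linear_2rnn(one_hot)))        # True
```

On one-hot inputs the bilinear contraction selects exactly the transition matrix of the current symbol, which is why the two computations coincide; the 2-RNN is strictly more general in that it also accepts arbitrary continuous input vectors, the setting the paper's learning algorithm targets.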


Related research

10/19/2020
Connecting Weighted Automata, Tensor Networks and Recurrent Neural Networks through Spectral Learning
In this paper, we present connections between three models used in diffe...

09/13/2017
Neural Network Based Nonlinear Weighted Finite Automata
Weighted finite automata (WFA) can expressively model functions defined ...

04/05/2019
Weighted Automata Extraction from Recurrent Neural Networks via Regression on State Spaces
We present a method to extract a weighted finite automaton (WFA) from a ...

01/16/2018
A Comparison of Rule Extraction for Different Recurrent Neural Network Models and Grammatical Complexity
It has been shown that rules can be extracted from highly non-linear, re...

06/14/2019
On the Computational Power of RNNs
Recent neural network architectures such as the basic recurrent neural n...

10/30/2019
Learning Deterministic Weighted Automata with Queries and Counterexamples
We present an algorithm for extraction of a probabilistic deterministic ...

06/29/2023
On the Relationship Between RNN Hidden State Vectors and Semantic Ground Truth
We examine the assumption that the hidden-state vectors of recurrent neu...
