Universal Approximation of Linear Time-Invariant (LTI) Systems through RNNs: Power of Randomness in Reservoir Computing

08/04/2023
by Shashank Jere, et al.

Recurrent neural networks (RNNs) are known to be universal approximators of dynamical systems under fairly mild and general assumptions, making them good tools for processing temporal information. However, RNNs usually suffer from vanishing and exploding gradients during standard RNN training. Reservoir computing (RC), a special RNN in which the recurrent weights are randomized and left untrained, has been introduced to overcome these issues and has demonstrated superior empirical performance in fields as diverse as natural language processing and wireless communications, especially in scenarios where training samples are extremely limited. However, the theoretical grounding supporting this observed performance has not developed at the same pace. In this work, we show that RNNs can provide universal approximation of linear time-invariant (LTI) systems. Specifically, we show that RC can universally approximate a general LTI system. We present a clear signal-processing interpretation of RC and utilize this understanding in the problem of simulating a generic LTI system through RC. Under this setup, we analytically characterize the optimal probability distribution for generating the recurrent weights of the underlying RNN of the RC. We provide extensive numerical evaluations to validate the optimality of the derived distribution of the recurrent weights for the LTI system simulation problem. Our work results in clear signal-processing-based model interpretability of RC and provides a theoretical explanation for the power of randomness in setting, rather than training, the recurrent weights of RC. It further provides a complete analytical characterization of the optimal untrained recurrent weights, marking an important step towards explainable machine learning (XML), which is extremely important for applications where training samples are limited.
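To make the setup concrete, the sketch below is a minimal linear echo state network, not the authors' implementation: the reservoir size, spectral radius, Gaussian weight distribution, and the target filter are all illustrative choices. It fixes random recurrent weights and trains only a least-squares linear readout to imitate a simple stable LTI filter, which is the general flavor of the LTI simulation problem studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def lti_target(x):
    """Illustrative stable IIR filter: y[t] = 0.6 y[t-1] + 0.4 x[t] - 0.2 x[t-1]."""
    y = np.zeros_like(x)
    for t in range(len(x)):
        y[t] = 0.4 * x[t]
        if t > 0:
            y[t] += 0.6 * y[t - 1] - 0.2 * x[t - 1]
    return y

N = 100    # reservoir size (illustrative)
rho = 0.9  # spectral radius < 1, for the echo state property

# Random, untrained recurrent weights; Gaussian here for illustration,
# whereas the paper derives the optimal distribution analytically.
W = rng.standard_normal((N, N))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.standard_normal(N)

def reservoir_states(x):
    """Run the linear reservoir s[t] = W s[t-1] + w_in x[t] over an input sequence."""
    S = np.zeros((len(x), N))
    s = np.zeros(N)
    for t, xt in enumerate(x):
        s = W @ s + w_in * xt  # linear activation, matching the LTI setting
        S[t] = s
    return S

# Train only the linear readout by least squares; the recurrent weights stay fixed.
x_train = rng.standard_normal(2000)
S_train = reservoir_states(x_train)
w_out, *_ = np.linalg.lstsq(S_train, lti_target(x_train), rcond=None)

# Evaluate on fresh input.
x_test = rng.standard_normal(500)
y_true = lti_target(x_test)
y_hat = reservoir_states(x_test) @ w_out
print("test NMSE:", np.mean((y_hat - y_true) ** 2) / np.var(y_true))
```

Note that a linear reservoir is itself an LTI system, so only the readout needs fitting; standard echo state networks instead use a tanh activation, and the quality of the approximation depends on the distribution from which W is drawn, which is exactly the quantity the paper optimizes.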

Related research

02/23/2022
NeuroView-RNN: It's About Time
Recurrent Neural Networks (RNNs) are important tools for processing sequ...

05/16/2017
Hierarchical Temporal Representation in Linear Reservoir Computing
Recently, studies on deep Reservoir Computing (RC) highlighted the role ...

08/21/2023
Simple Cycle Reservoirs are Universal
Reservoir computation models form a subclass of recurrent neural network...

01/16/2018
A Comparison of Rule Extraction for Different Recurrent Neural Network Models and Grammatical Complexity
It has been shown that rules can be extracted from highly non-linear, re...

09/18/2020
Recurrent Graph Tensor Networks
Recurrent Neural Networks (RNNs) are among the most successful machine l...

05/31/2019
Improved memory in recurrent neural networks with sequential non-normal dynamics
Training recurrent neural networks (RNNs) is a hard problem due to degen...

11/28/2022
Metric entropy of causal, discrete-time LTI systems
In [1] it is shown that recurrent neural networks (RNNs) can learn - in ...
