Complexity-calibrated Benchmarks for Machine Learning Reveal When Next-Generation Reservoir Computer Predictions Succeed and Mislead

03/25/2023
by   Sarah E. Marzen, et al.

Recurrent neural networks are used to forecast time series in finance, climate, language, and many other domains. Reservoir computers are a particularly easily trainable form of recurrent neural network. Recently, a "next-generation" reservoir computer was introduced in which the memory trace involves only a finite number of previous symbols. We explore the inherent limitations of finite-past memory traces in this intriguing proposal. A lower bound from Fano's inequality shows that, on highly non-Markovian processes generated by large probabilistic state machines, next-generation reservoir computers with reasonably long memory traces have an error probability of at least 60% when predicting the next observation. More generally, it appears that popular recurrent neural networks fall far short of optimally predicting such complex processes. These results highlight the need for a new generation of optimized recurrent neural network architectures. Alongside this finding, we present concentration-of-measure results for randomly-generated but complex processes. One conclusion is that large probabilistic state machines – specifically, large ϵ-machines – are key to generating challenging and structurally-unbiased stimuli for ground-truthing recurrent neural network architectures.
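The Fano-style bound mentioned above can be made concrete: Fano's inequality states that any predictor of a symbol X from side information (here, a finite memory trace) with error probability Pe must satisfy H(X | memory) ≤ h(Pe) + Pe·log2(|X| − 1), where h is the binary entropy. Inverting this gives a lower bound on Pe from the conditional entropy. The following sketch (not from the paper; the function names and the bisection approach are this note's own) computes that bound numerically:

```python
import math

def binary_entropy(p):
    # h(p) = -p log2 p - (1-p) log2(1-p), with h(0) = h(1) = 0
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def fano_lower_bound(cond_entropy_bits, alphabet_size):
    """Smallest error probability Pe consistent with Fano's inequality:
       H(X | memory) <= h(Pe) + Pe * log2(alphabet_size - 1)."""
    if alphabet_size < 2:
        return 0.0
    log_term = math.log2(alphabet_size - 1) if alphabet_size > 2 else 0.0
    # The right-hand side is nondecreasing in Pe on [0, (k-1)/k],
    # so bisection finds the smallest Pe meeting the bound.
    lo, hi = 0.0, (alphabet_size - 1) / alphabet_size
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if binary_entropy(mid) + mid * log_term < cond_entropy_bits:
            lo = mid
        else:
            hi = mid
    return lo
```

For instance, a binary process whose next symbol retains a full bit of uncertainty given the memory trace forces Pe ≥ 0.5; a 4-symbol process with 2 bits of residual uncertainty forces Pe ≥ 0.75. The paper's 60% figure arises from the same inversion applied to the conditional entropy of its large ϵ-machine processes given a finite past.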


