The Asymptotic Performance of Linear Echo State Neural Networks

03/25/2016
by Romain Couillet, et al.

In this article, we study the mean-square error (MSE) performance of linear echo-state neural networks, for both training and testing tasks. Considering the realistic setting of noise present at the network nodes, we derive deterministic equivalents for the aforementioned MSE in the limit where the number T of input samples and the network size n both grow large. Specializing the network connectivity matrix to specific random settings, we then obtain simple formulas that provide new insights into the performance of such networks.
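For intuition on the setup the abstract describes, the following is a minimal numerical sketch of a linear echo state network with additive noise at the nodes: a random connectivity matrix drives a linear reservoir, a least-squares readout is fit on a training window, and the training and testing MSE are measured. The delay task, the spectral-radius rescaling, and all parameter values below are illustrative assumptions, not the paper's exact model or its deterministic-equivalent derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

n, T, T_test = 200, 1000, 500   # network size, training and testing lengths (illustrative)
sigma2 = 1e-2                   # variance of the noise at the network nodes (assumed)

# Random connectivity matrix W, rescaled so its spectral radius stays below 1
W = rng.standard_normal((n, n)) / np.sqrt(n)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
m = rng.standard_normal(n) / np.sqrt(n)   # input weight vector

def run_reservoir(u):
    """Linear reservoir with node noise: x_t = W x_{t-1} + m u_t + noise."""
    X = np.zeros((n, len(u)))
    x = np.zeros(n)
    for t, ut in enumerate(u):
        x = W @ x + m * ut + np.sqrt(sigma2) * rng.standard_normal(n)
        X[:, t] = x
    return X

# Toy task (hypothetical): reconstruct a delayed copy u_{t-tau} of the input
tau = 3
u = rng.standard_normal(T + T_test)
y = np.roll(u, tau)

X = run_reservoir(u)
X_tr, X_te = X[:, :T], X[:, T:]
y_tr, y_te = y[:T], y[T:]

# Least-squares readout, as in plain linear regression
w = np.linalg.lstsq(X_tr.T, y_tr, rcond=None)[0]

mse_train = np.mean((X_tr.T @ w - y_tr) ** 2)
mse_test = np.mean((X_te.T @ w - y_te) ** 2)
print(f"training MSE: {mse_train:.4f}, testing MSE: {mse_test:.4f}")
```

Running such a simulation for growing n and T is one way to check a large-dimensional deterministic equivalent empirically: the measured MSE should concentrate around its limiting value as both dimensions increase.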
