
Echo State Networks with Self-Normalizing Activations on the Hyper-Sphere
Among the various architectures of Recurrent Neural Networks, Echo State...
03/27/2019 ∙ by Pietro Verzelli, et al.

Product Reservoir Computing: Time-Series Computation with Multiplicative Neurons
Echo state networks (ESN), a type of reservoir computing (RC) architectu...
02/03/2015 ∙ by Alireza Goudarzi, et al.

Echo State Queueing Network: a new reservoir computing learning tool
In the last decade, a new computational paradigm was introduced in the f...
12/26/2012 ∙ by Sebastián Basterrech, et al.

Pre-trainable Reservoir Computing with Recursive Neural Gas
Echo State Networks (ESN) are a class of Recurrent Neural Networks (RNN)...
07/25/2018 ∙ by Luca Carcano, et al.

Characterizing the hyperparameter space of LSTM language models for mixed context applications
Applying state of the art deep learning models to novel real world datas...
12/08/2017 ∙ by Victor Akinwande, et al.

Signal propagation in continuous approximations of binary neural networks
The training of stochastic neural network models with binary (±1) weight...
02/01/2019 ∙ by George Stamatescu, et al.

Towards a Calculus of Echo State Networks
Reservoir computing is a recent trend in neural networks which uses the ...
09/01/2014 ∙ by Alireza Goudarzi, et al.

A characterization of the Edge of Criticality in Binary Echo State Networks
Echo State Networks (ESNs) are simplified recurrent neural network models composed of a reservoir and a linear, trainable readout layer. The reservoir is tunable through hyperparameters that control the network's behaviour. ESNs are known to be effective at solving tasks when configured in a region of (hyper)parameter space called the Edge of Criticality (EoC), where the system is maximally sensitive to perturbations. In this paper, we propose binary ESNs, which are architecturally equivalent to standard ESNs but use binary activation functions and binary recurrent weights. For these networks, we derive a closed-form expression for the EoC in the autonomous case, and we perform simulations to assess their behaviour in the presence of noisy neurons and of an input signal. We also propose a theoretical explanation for the observation that the variance of the input plays a major role in characterizing the EoC.
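The binary ESN described in the abstract can be illustrated with a short sketch. All specifics below are illustrative assumptions, not the paper's exact setup: the reservoir size, the scaling factor `gain`, the use of `sign` as the binary activation, and the one-bit perturbation experiment (a rough probe of the sensitivity that the EoC characterizes) are all choices made here for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100   # reservoir neurons (illustrative choice)
T = 200   # time steps to simulate

# Binary recurrent and input weights in {-1, +1}, as in the binary ESN idea.
W = rng.choice([-1.0, 1.0], size=(N, N))
w_in = rng.choice([-1.0, 1.0], size=N)

def step(x, u, gain=0.1):
    """One reservoir update with a binary (sign) activation.

    `gain` is a hypothetical scaling hyperparameter standing in for
    whatever controls the distance to the Edge of Criticality.
    """
    return np.sign(gain * (W @ x) + w_in * u)

# Autonomous run (no input) from a random binary state.
x = rng.choice([-1.0, 1.0], size=N)
states = []
for t in range(T):
    x = step(x, u=0.0)
    states.append(x.copy())
states = np.array(states)

# Crude sensitivity probe: flip one neuron in the initial state and
# track how far the two autonomous trajectories drift apart.
x_a = rng.choice([-1.0, 1.0], size=N)
x_b = x_a.copy()
x_b[0] *= -1.0
for t in range(T):
    x_a = step(x_a, u=0.0)
    x_b = step(x_b, u=0.0)
hamming = np.mean(x_a != x_b)  # fraction of neurons that disagree
```

Sweeping `gain` (or an input-variance parameter, per the abstract's observation) and plotting `hamming` is one simple way to see the transition between ordered and chaotic regimes that the paper analyzes in closed form.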