Exploring Transfer Function Nonlinearity in Echo State Networks

02/16/2015
by Alireza Goudarzi, et al.

Supralinear and sublinear pre-synaptic and dendritic integration is considered to be responsible for the nonlinear computational power of biological neurons, emphasizing the role of nonlinear integration as opposed to nonlinear output thresholding. How, why, and to what degree transfer function nonlinearity helps biologically inspired neural network models is not fully understood. Here, we study these questions in the context of echo state networks (ESN). An ESN is a simple neural network architecture in which a fixed recurrent network is driven with an input signal, and the output is generated by a readout layer from measurements of the network states. The ESN architecture enjoys efficient training and good performance on certain signal-processing tasks, such as system identification and time series prediction. ESN performance has been analyzed with respect to the connectivity pattern in the network structure and the input bias. However, the effects of the transfer function in the network have not been studied systematically. Here, we use an approach based on the Taylor expansion of a frequently used transfer function, the hyperbolic tangent (tanh), to systematically study the effect of increasing transfer function nonlinearity on the memory, nonlinear capacity, and signal-processing performance of ESN. Interestingly, we find that a quadratic approximation is enough to capture the computational power of an ESN with the tanh function. The results of this study apply to both software and hardware implementations of ESN.
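
The setup described in the abstract can be sketched in a few lines of NumPy. The following is a minimal, illustrative ESN whose transfer function is a truncated Taylor series of tanh; all parameter values, names, and the delay-recall readout task are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumed, not from the paper).
N = 50          # reservoir size
T = 500         # number of time steps
washout = 100   # initial transient to discard

W_in = rng.uniform(-0.5, 0.5, N)                 # input weights
W = rng.uniform(-0.5, 0.5, (N, N))               # recurrent weights
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius

def f_taylor(x, order=3):
    """Truncated Taylor series of tanh about 0: x - x^3/3 + 2x^5/15 - ..."""
    coeffs = {1: 1.0, 3: -1.0 / 3, 5: 2.0 / 15}
    return sum(c * x**k for k, c in coeffs.items() if k <= order)

# Drive the reservoir with a small random input and collect states.
u = rng.uniform(-0.2, 0.2, T)
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = f_taylor(W @ x + W_in * u[t], order=3)
    X[t] = x

# Linear readout trained by ridge regression on a 1-step memory task:
# reconstruct u[t-1] from the current reservoir state.
y = np.roll(u, 1)
A, b = X[washout:], y[washout:]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ b)
pred = A @ w_out
```

Raising `order` in `f_taylor` moves the transfer function from purely linear (`order=1`) toward tanh, which is one simple way to probe how added nonlinearity trades memory against nonlinear processing capacity.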

Related research

- Product Reservoir Computing: Time-Series Computation with Multiplicative Neurons (02/03/2015). Echo state networks (ESN), a type of reservoir computing (RC) architectu...
- Brain-inspired photonic signal processor for periodic pattern generation and chaotic system emulation (02/06/2018). Reservoir computing is a bio-inspired computing paradigm for processing ...
- Neuronal architecture extracts statistical temporal patterns (01/24/2023). Neuronal systems need to process temporal signals. We here show how high...
- Universal Approximation with Quadratic Deep Networks (07/31/2018). Recently, deep learning has been playing a central role in machine learn...
- On the Convex Properties of Wireless Power Transfer with Nonlinear Energy Harvesting (12/10/2019). The convex property of a nonlinear wireless power transfer (WPT) is char...
- Power series expansion neural network (02/25/2021). In this paper, we develop a new neural network family based on power ser...
- Adaptive Extreme Learning Machine for Recurrent Beta-basis Function Neural Network Training (10/31/2018). Beta Basis Function Neural Network (BBFNN) is a special kind of kernel b...
