Unfolding recurrence by Green's functions for optimized reservoir computing

10/13/2020
by Sandra Nestler, et al.

Cortical networks are strongly recurrent, and neurons have intrinsic temporal dynamics. This sets them apart from deep feed-forward networks. Despite the tremendous progress in the application of feed-forward networks and in their theoretical understanding, it remains unclear how the interplay of recurrence and non-linearities in recurrent cortical networks contributes to their function. The purpose of this work is to present a solvable recurrent network model that links to feed-forward networks. Using perturbative methods, we transform the time-continuous, recurrent dynamics into an effective feed-forward structure of linear and non-linear temporal kernels. The resulting analytical expressions allow us to build optimal time-series classifiers from random reservoir networks. Firstly, this allows us to optimize not only the readout vectors but also the input projection, demonstrating a strong potential performance gain. Secondly, the analysis exposes how the second-order stimulus statistics are a crucial element that interacts with the non-linearity of the dynamics and boosts performance.
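The setting described above can be illustrated with a minimal reservoir-computing sketch: a random recurrent network driven by a time-dependent input, whose linearization around the fixed point corresponds to the Green's-function (first-order kernel) picture, and whose readout is trained in closed form. All names, sizes, and stimulus choices below are illustrative assumptions, not the paper's actual model or parameters; the two stimulus classes differ only in their frequency content, echoing the role of second-order stimulus statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small reservoir (sizes and constants are illustrative).
N, T = 50, 200          # reservoir size, number of time steps
dt, tau = 0.1, 1.0      # integration step, neuronal time constant
g = 0.8                 # recurrent coupling strength (subcritical)
W = g * rng.standard_normal((N, N)) / np.sqrt(N)   # random recurrence
w_in = rng.standard_normal(N)                      # input projection

def run_reservoir(u, nonlinear=True):
    """Euler-integrate tau * dx/dt = -x + W phi(x) + w_in u(t).

    With nonlinear=False the dynamics are linear, so the response is the
    convolution of the input with the network's Green's function.
    """
    x = np.zeros(N)
    states = np.zeros((T, N))
    for t in range(T):
        phi = np.tanh(x) if nonlinear else x
        x = x + dt / tau * (-x + W @ phi + w_in * u[t])
        states[t] = x
    return states

# Two stimulus classes that differ in their second-order statistics
# (here: oscillation frequency), plus a small noise term.
def make_stimulus(freq):
    return np.sin(freq * dt * np.arange(T)) + 0.1 * rng.standard_normal(T)

n_per_class = 30
features, labels = [], []
for label, freq in enumerate([1.0, 3.0]):
    for _ in range(n_per_class):
        states = run_reservoir(make_stimulus(freq))
        features.append(states[-1])   # read out the final reservoir state
        labels.append(label)
X, y = np.array(features), np.array(labels)

# Closed-form ridge-regression readout: only the readout is trained,
# the recurrent weights stay random, as in standard reservoir computing.
lam = 1e-3
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ (2 * y - 1))
accuracy = np.mean((X @ w_out > 0).astype(int) == y)
```

In this sketch only `w_out` is learned; the paper's point is that the analytical kernel expressions additionally allow optimizing the input projection `w_in`, which a purely numerical readout fit does not provide.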


Related research

04/21/2017: Feed-forward approximations to dynamic recurrent network architectures
Recurrent neural network architectures can have useful computational pro...

10/29/2020: Overcoming The Limitations of Neural Networks in Composite-Pattern Learning with Architopes
The effectiveness of neural networks in solving complex problems is well...

10/20/2017: Point Neurons with Conductance-Based Synapses in the Neural Engineering Framework
The mathematical model underlying the Neural Engineering Framework (NEF)...

02/28/2020: Temporal Convolutional Attention-based Network For Sequence Modeling
With the development of feed-forward models, the default model for seque...

05/12/2023: Optimal signal propagation in ResNets through residual scaling
Residual networks (ResNets) have significantly better trainability and t...

10/02/2021: Recurrent circuits as multi-path ensembles for modeling responses of early visual cortical neurons
In this paper, we showed that adding within-layer recurrent connections ...

10/21/2018: Transition-based Parsing with Lighter Feed-Forward Networks
We explore whether it is possible to build lighter parsers, that are sta...
