Takens-inspired neuromorphic processor: a downsizing tool for random recurrent neural networks via feature extraction

07/06/2019
by Bicky A. Marquez, et al.

We describe a new technique that minimizes the number of neurons in the hidden layer of a random recurrent neural network (rRNN) for time series prediction. By merging Takens-based attractor reconstruction methods with machine learning, we identify a feature-extraction mechanism that can be leveraged to reduce the network size. We obtain criteria specific to the prediction task at hand and derive the scaling law of the prediction error. We demonstrate the consequences of our theory by designing a Takens-inspired hybrid processor, which extends an rRNN with an a priori designed external delay memory. Our hybrid architecture therefore comprises both real and virtual nodes. Via this symbiosis, we demonstrate the performance of the hybrid processor by stabilizing an arrhythmic neural model. Thanks to the design rules we obtain, we can reduce the size of the stabilizing neural network by a factor of 15 relative to a standard system.
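The core ingredient of Takens-based attractor reconstruction is delay embedding: a scalar time series is mapped into delay-coordinate vectors that recover the geometry of the underlying attractor. A minimal sketch of such an embedding in Python follows; the function name `delay_embed` and the parameters `dim` (embedding dimension) and `tau` (delay in samples) are illustrative choices, not the authors' code.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Delay-coordinate embedding of a scalar series x.

    Row i is the vector [x[i], x[i + tau], ..., x[i + (dim - 1) * tau]],
    so the output has shape (len(x) - (dim - 1) * tau, dim).
    """
    x = np.asarray(x)
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    # Stack dim shifted copies of the series as columns.
    return np.column_stack([x[k * tau : k * tau + n] for k in range(dim)])

# Example: embed a short series with dimension 3 and delay 2.
E = delay_embed(np.arange(10), dim=3, tau=2)
```

In the hybrid processor described above, such delay coordinates play the role of the a priori designed external delay memory (the "virtual" nodes) that complements the rRNN's real nodes.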


Related research

09/14/2016: Convolutional Recurrent Neural Networks for Music Classification
We introduce a convolutional recurrent neural network (CRNN) for music t...

09/07/2021: CRNNTL: convolutional recurrent neural network and transfer learning for QSAR modelling
In this study, we propose the convolutional recurrent neural network and...

09/11/2020: P-CRITICAL: A Reservoir Autoregulation Plasticity Rule for Neuromorphic Hardware
Backpropagation algorithms on recurrent artificial neural networks requi...

03/06/2019: Autoregressive Convolutional Recurrent Neural Network for Univariate and Multivariate Time Series Prediction
Time series forecasting (univariate and multivariate) is a problem of hi...

04/24/2018: Genesis of Basic and Multi-Layer Echo State Network Recurrent Autoencoders for Efficient Data Representations
It is a widely accepted fact that data representations intervene noticea...

07/22/2014: Deep Recurrent Neural Networks for Time Series Prediction
Ability of deep networks to extract high level features and of recurrent...

03/25/2022: A Hybrid Framework for Sequential Data Prediction with End-to-End Optimization
We investigate nonlinear prediction in an online setting and introduce a...
