Predicting Performance using Approximate State Space Model for Liquid State Machines

01/18/2019
by Ajinkya Gorad, et al.

Liquid State Machine (LSM) is a brain-inspired architecture used for solving problems like speech recognition and time series prediction. An LSM comprises a randomly connected recurrent network of spiking neurons, which propagates non-linear neuronal and synaptic dynamics. Maass et al. have argued that the non-linear dynamics of LSMs are essential for their performance as a universal computer. The Lyapunov exponent (mu), used to characterize the "non-linearity" of the network, correlates well with LSM performance. We propose a complementary approach of approximating the LSM dynamics with a linear state space representation. The spike rates from this model correlate well with the spike rates from the LSM. Such equivalence allows the extraction of a "memory" metric (tau_M) from the state transition matrix. tau_M displays a high correlation with performance. Further, systems with high tau_M require fewer epochs to achieve a given accuracy. Being computationally cheap (1800x more time-efficient than the LSM), the tau_M metric enables exploration of the vast parameter design space. We observe that the performance correlation of tau_M surpasses that of the Lyapunov exponent (mu) by 2-4x in the high-performance regime over multiple datasets. In fact, while mu increases monotonically with network activity, performance reaches a maximum at a specific activity level described in the literature as the "edge of chaos". On the other hand, tau_M remains correlated with LSM performance even as mu increases monotonically. Hence, tau_M captures the useful memory of network activity that enables LSM performance, and it enables rapid design-space exploration and fine-tuning of LSM parameters for high performance.
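To make the idea concrete, the sketch below shows one plausible way to fit a linear state space model x[t+1] = A x[t] + B u[t] to recorded spike rates and read a memory time constant off the slowest eigenmode of A. This is not the paper's implementation: the least-squares fitting step, the function names, and the definition of tau_M as -dt / ln|lambda_max| are assumptions made purely for illustration (Python with NumPy).

# Hypothetical sketch: approximate reservoir spike rates with a linear state
# space model and estimate a memory time constant from the transition matrix.
# All names and the exact tau_M definition are illustrative assumptions.
import numpy as np

def fit_state_space(rates, inputs):
    """Least-squares fit of A, B from spike-rate snapshots.

    rates:  (T, N) array of reservoir spike rates over T time steps
    inputs: (T, M) array of input drive over the same steps
    """
    x_now, x_next = rates[:-1], rates[1:]      # x[t], x[t+1]
    u_now = inputs[:-1]                        # u[t]
    # Solve x[t+1] ~= [x[t], u[t]] @ [A^T; B^T] in the least-squares sense.
    regressors = np.hstack([x_now, u_now])
    coeffs, *_ = np.linalg.lstsq(regressors, x_next, rcond=None)
    n = rates.shape[1]
    return coeffs[:n].T, coeffs[n:].T          # A, B

def memory_time_constant(A, dt=1.0):
    """tau_M from the slowest-decaying eigenmode of the state transition matrix.

    A discrete-time mode with |lambda| < 1 decays as |lambda|^t,
    i.e. with time constant -dt / ln|lambda|.
    """
    lam_max = np.abs(np.linalg.eigvals(A)).max()
    if lam_max >= 1.0:                         # non-decaying mode: memory diverges
        return np.inf
    return -dt / np.log(lam_max)

# Toy usage with synthetic rate data (illustration only).
rng = np.random.default_rng(0)
T, N, M = 500, 20, 3
A_true = 0.9 * np.linalg.qr(rng.standard_normal((N, N)))[0]
B_true = 0.1 * rng.standard_normal((N, M))
u = rng.standard_normal((T, M))
x = np.zeros((T, N))
for t in range(T - 1):
    x[t + 1] = A_true @ x[t] + B_true @ u[t]

A_hat, _ = fit_state_space(x, u)
print("estimated tau_M:", memory_time_constant(A_hat))

In this toy example the true transition matrix has spectral radius 0.9, so the estimated tau_M should come out near -1/ln(0.9) ≈ 9.5 time steps; because the fit only involves one least-squares solve and one eigendecomposition, it is far cheaper than simulating the full spiking network, which is the source of the large speed-up claimed in the abstract.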

