Composing Recurrent Spiking Neural Networks using Locally-Recurrent Motifs and Risk-Mitigating Architectural Optimization

08/04/2021
by Wenrui Zhang, et al.

In neural circuits, recurrent connectivity plays a crucial role in network function and stability. However, existing recurrent spiking neural networks (RSNNs) are often constructed with random connections and without optimization. While RSNNs can produce rich dynamics that are critical for memory formation and learning, systematic architectural optimization of RSNNs remains an open challenge. We aim to enable the systematic design of large RSNNs via a new scalable RSNN architecture and automated architectural optimization. We compose RSNNs from a layer architecture called the Sparsely-Connected Recurrent Motif Layer (SC-ML), which consists of multiple small recurrent motifs wired together by sparse lateral connections. The small size of the motifs and the sparse inter-motif connectivity make this RSNN architecture scalable to large network sizes. We further propose a method called Hybrid Risk-Mitigating Architectural Search (HRMAS) to systematically optimize the topology of the proposed recurrent motifs and the SC-ML layer architecture. HRMAS is an alternating two-step optimization process that mitigates the risk of network instability and performance degradation caused by architectural change through a novel biologically-inspired "self-repairing" mechanism based on intrinsic plasticity. The intrinsic plasticity is introduced in the second step of each HRMAS iteration and acts as unsupervised fast self-adaptation to the structural and synaptic weight modifications introduced by the first step during the RSNN architectural "evolution". To the best of the authors' knowledge, this is the first work to perform systematic architectural optimization of RSNNs. Using one speech and three neuromorphic datasets, we demonstrate the significant performance improvement brought by the proposed automated architecture optimization over existing manually-designed RSNNs.
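The abstract describes the SC-ML layer as multiple small recurrent motifs (dense diagonal blocks of the layer's recurrent weight matrix) wired together by sparse lateral inter-motif connections, with HRMAS alternating an architecture/weight update and an unsupervised intrinsic-plasticity adaptation. The sketch below illustrates only that structure; the motif size, lateral density, weight scales, and the homeostatic threshold rule (the hypothetical helpers `build_sc_ml_weights` and `intrinsic_plasticity_step`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def build_sc_ml_weights(num_motifs=8, motif_size=16, lateral_density=0.05, rng=None):
    """Recurrent weight matrix of one SC-ML-style layer: dense recurrence
    inside each small motif (diagonal blocks) plus sparse lateral
    connections between motifs (off-diagonal entries). All sizes and
    densities here are illustrative assumptions."""
    rng = np.random.default_rng() if rng is None else rng
    n = num_motifs * motif_size
    w = np.zeros((n, n))
    # Dense intra-motif recurrence: one small block per motif.
    for m in range(num_motifs):
        lo, hi = m * motif_size, (m + 1) * motif_size
        w[lo:hi, lo:hi] = rng.normal(0.0, 0.1, (motif_size, motif_size))
    # Sparse inter-motif lateral connections (off-diagonal only).
    lateral_mask = rng.random((n, n)) < lateral_density
    for m in range(num_motifs):
        lo, hi = m * motif_size, (m + 1) * motif_size
        lateral_mask[lo:hi, lo:hi] = False  # keep motif blocks untouched
    w[lateral_mask] = rng.normal(0.0, 0.05, lateral_mask.sum())
    return w

def intrinsic_plasticity_step(thresholds, firing_rates, target_rate=0.1, lr=0.01):
    """Unsupervised "self-repairing" step: nudge each neuron's firing
    threshold so its rate moves toward a homeostatic target. A generic
    homeostatic rule, used only to illustrate the idea."""
    return thresholds + lr * (firing_rates - target_rate)
```

In these terms, the first step of each HRMAS iteration would modify the motif topology and weights (e.g., the lateral mask above), and the second step would run an update like `intrinsic_plasticity_step` to restabilize firing activity before the next architectural change.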


Related research

research · 10/23/2020
Skip-Connected Self-Recurrent Spiking Neural Networks with Joint Intrinsic Parameter and Synaptic Weight Training
As an important class of spiking neural networks (SNNs), recurrent spiki...

research · 09/11/2023
Brain-inspired Evolutionary Architectures for Spiking Neural Networks
The complex and unique neural network topology of the human brain formed...

research · 03/31/2023
Adaptive structure evolution and biologically plausible synaptic plasticity for recurrent spiking neural networks
The architecture design and multi-scale learning principles of the human...

research · 02/04/2020
Multi-Objective Optimization for Size and Resilience of Spiking Neural Networks
Inspired by the connectivity mechanisms in the brain, neuromorphic compu...

research · 11/22/2022
Adaptive Sparse Structure Development with Pruning and Regeneration for Spiking Neural Networks
Spiking Neural Networks (SNNs) are more biologically plausible and compu...

research · 03/27/2018
Inferring network connectivity from event timing patterns
Reconstructing network connectivity from the collective dynamics of a sy...
