The Global Structure of Codimension-2 Local Bifurcations in Continuous-Time Recurrent Neural Networks

11/08/2021
by Randall D. Beer, et al.

If we are ever to move beyond the study of isolated special cases in theoretical neuroscience, we need to develop more general theories of neural circuits over a given neural model. The present paper considers this challenge in the context of continuous-time recurrent neural networks (CTRNNs), a simple but dynamically universal model that has been widely used in both computational neuroscience and neural networks research. Here we extend previous work on the parameter-space structure of codimension-1 local bifurcations in CTRNNs to include codimension-2 local bifurcation manifolds. Specifically, we derive the necessary conditions for all generic local codimension-2 bifurcations in general CTRNNs, specialize these conditions to circuits containing one to four neurons, illustrate in full detail the application of these conditions to example circuits, derive closed-form expressions for these bifurcation manifolds where possible, and demonstrate how this analysis allows us to find and trace several global codimension-1 bifurcation manifolds that originate from the codimension-2 bifurcations.
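For reference, the CTRNN model discussed in the abstract is conventionally written (in the notation of Beer's earlier work; the paper's exact parameterization may differ slightly) as the system of ordinary differential equations

\[ \tau_i \, \frac{dy_i}{dt} = -y_i + \sum_{j=1}^{N} w_{ji}\, \sigma(y_j + \theta_j) + I_i, \qquad i = 1, \dots, N, \]

where y_i is the state of neuron i, \tau_i its time constant, w_{ji} the weight of the connection from neuron j to neuron i, \theta_j a bias, I_i an external input, and \sigma(x) = 1/(1 + e^{-x}) the logistic activation function. Local bifurcation conditions are then obtained from the equilibria of this system together with the eigenvalue structure of its Jacobian.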

