A Step Towards Uncovering The Structure of Multistable Neural Networks

10/06/2022
by   Magnus Tournoy, et al.

We study the structure of multistable recurrent neural networks. The activation function is simplified to a nonsmooth Heaviside step function. This nonlinearity partitions the phase space into regions with different, yet linear, dynamics. We derive how multistability is encoded within the network architecture. Stable states are identified through semipositivity constraints on the synaptic weight matrix. These restrictions can be separated by their effects on the signs or the strengths of the connections. Exact results on network topology, sign stability, weight matrix factorization, pattern completion, and pattern coupling are derived and proven. These results may lay the foundation for more complex recurrent neural networks and neurocomputing.
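The idea of multiple stable states coexisting in a network with Heaviside activation can be illustrated with a small numerical sketch. The model below is a hypothetical discrete-time example, not the paper's exact formulation: a map x ← H(Wx) whose fixed points are binary patterns satisfying x = H(Wx). The Hopfield-style outer-product weight construction is likewise an illustrative assumption; the paper instead characterizes stable states via semipositivity constraints on W.

```python
import numpy as np

# Hypothetical minimal sketch (not the paper's exact model): a discrete-time
# recurrent network x <- H(W x) with a Heaviside step activation H. Inside
# each region where the sign pattern of W x is constant, the dynamics are
# linear, and a stable state is a binary pattern satisfying x = H(W x).

def heaviside(u):
    return (u > 0).astype(float)

def is_fixed_point(W, x):
    # x is a stable (fixed) state iff it reproduces itself under the map
    return np.array_equal(heaviside(W @ x), x)

# Hopfield-style outer-product weights storing two patterns (illustration
# only, chosen so both patterns become fixed points of the map).
p1 = np.array([1., 1., 0., 0.])
p2 = np.array([0., 0., 1., 1.])
s1, s2 = 2 * p1 - 1, 2 * p2 - 1
W = np.outer(s1, s1) + np.outer(s2, s2)
np.fill_diagonal(W, 0)

zero = np.zeros(4)
for name, x in [("p1", p1), ("p2", p2), ("zero", zero)]:
    print(name, "is a fixed point:", is_fixed_point(W, x))
# All three report True: the same weight matrix W supports several
# coexisting stable states, i.e. the network is multistable.
```

Each stable pattern carves out its own basin of attraction in phase space; within the region where the sign pattern of Wx is fixed, the update reduces to a constant (hence trivially linear) map, mirroring the piecewise-linear structure described in the abstract.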


