Vector Symbolic Finite State Machines in Attractor Neural Networks

12/02/2022
by Madison Cotteret et al.

Hopfield attractor networks are robust, distributed models of human memory. We propose construction rules such that an attractor network may implement an arbitrary finite state machine (FSM), where states and stimuli are represented by high-dimensional random bipolar vectors and all state transitions are enacted by the attractor network's dynamics. Numerical simulations show the capacity of the model, in terms of the maximum size of the implementable FSM, to be linear in the size of the attractor network. We show that the model is robust to imprecise and noisy weights, making it a prime candidate for implementation with high-density but unreliable devices. By endowing attractor networks with the ability to emulate arbitrary FSMs, we propose a plausible path by which FSMs may exist as a distributed computational primitive in biological neural networks.
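
The abstract leaves the construction rules implicit, so the following is a minimal sketch of one plausible Hebbian-style realization, not the paper's actual method. It stores each FSM state as an attractor via autoassociative outer products, and keys each transition on the Hadamard (elementwise) binding of the current state vector with a stimulus vector, a common vector-symbolic binding operation. The toy two-state machine, the gain `lam`, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2000  # network size; also the dimension of the bipolar vectors

def rand_bipolar(n, d=N):
    """n random dense bipolar (+/-1) vectors, as rows of an (n, d) array."""
    return rng.choice([-1, 1], size=(n, d))

# Toy FSM (illustrative, not from the paper): states A, B; stimuli a, b;
# transitions (A, a) -> B and (B, b) -> A.
states  = dict(zip("AB", rand_bipolar(2)))
stimuli = dict(zip("ab", rand_bipolar(2)))
transitions = [("A", "a", "B"), ("B", "b", "A")]

# Autoassociative term: each state vector becomes a fixed point.
W_auto = sum(np.outer(x, x) for x in states.values()) / N

# Heteroassociative term: keyed on the current state bound (Hadamard
# product) with the stimulus, pointing at the target state. lam > 1 lets
# a transition term override the attractor holding the current state.
lam = 2.0
W_trans = sum(lam * np.outer(states[j], states[i] * stimuli[s])
              for i, s, j in transitions) / N

def step(y, u):
    """One synchronous update; u is the stimulus (all ones = no stimulus)."""
    h = W_auto @ y + W_trans @ (y * u)
    return np.where(h >= 0, 1, -1)

def nearest_state(y):
    """Name of the stored state with the largest overlap with y."""
    return max(states, key=lambda k: states[k] @ y)

y = states["A"].copy()
for sym in "ab":
    y = step(y, stimuli[sym])       # stimulus drives the transition
    for _ in range(3):              # then settle into the new attractor
        y = step(y, np.ones(N))
    print(sym, "->", nearest_state(y))  # expect: a -> B, then b -> A
```

With the stimulus present, the heteroassociative term dominates (lam > 1) and a single synchronous update lands near the target state; once the stimulus is withdrawn, the binding y * u no longer matches any stored transition key (random overlaps shrink as O(1/sqrt(N))), so the autoassociative term cleans the activity up into the target attractor, consistent with the abstract's claim that transitions are enacted entirely by the network's dynamics.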
