Cellular Automata Can Reduce Memory Requirements of Collective-State Computing

10/07/2020
by   Denis Kleyko, et al.

Various non-classical approaches to distributed information processing, such as neural networks, computation with Ising models, reservoir computing, vector symbolic architectures, and others, employ the principle of collective-state computing. In this type of computing, the variables relevant to a computation are superimposed into a single high-dimensional state vector, the collective state. The variable encoding uses a fixed set of random patterns, which has to be stored and kept available during the computation. Here we show that an elementary cellular automaton with rule 90 (CA90) enables a space-time tradeoff for collective-state computing models that use random dense binary representations: memory requirements can be traded off against computation by running CA90. We investigate the randomization behavior of CA90, in particular the relation between the length of the randomization period and the size of the grid, and how CA90 preserves similarity in the presence of initialization noise. Based on these analyses, we discuss how to optimize a collective-state computing model in which CA90 expands representations on the fly from short seed patterns, rather than storing the full set of random patterns. The CA90 expansion is applied and tested in concrete scenarios using reservoir computing and vector symbolic architectures. Our experimental results show that collective-state computing with CA90 expansion performs comparably to traditional collective-state models, in which random patterns are generated initially by a pseudo-random number generator and then stored in a large memory.
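The core idea of the abstract can be illustrated with a minimal sketch. The following code is an assumption-laden toy, not the authors' implementation: it iterates elementary cellular automaton rule 90 (each cell becomes the XOR of its two neighbors, here with periodic boundary conditions) to expand a short random binary seed into a longer pseudo-random vector, so that only the seed has to be stored.

```python
import random

def ca90_step(state):
    """One rule-90 update on a circular grid: cell i <- state[i-1] XOR state[i+1]."""
    n = len(state)
    return [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]

def expand_seed(seed, steps):
    """Toy expansion: concatenate successive CA90 states of the seed
    into one long binary vector of length len(seed) * steps."""
    state, expanded = list(seed), []
    for _ in range(steps):
        expanded.extend(state)
        state = ca90_step(state)
    return expanded

# Store a short 16-bit seed; regenerate a 128-dimensional pattern on demand.
random.seed(0)
seed = [random.randint(0, 1) for _ in range(16)]
vector = expand_seed(seed, 8)
print(len(vector))  # 128
```

Because CA90 is deterministic, the same seed always expands to the same long pattern, which is what allows the memory for the full random pattern set to be traded for a small amount of recomputation.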

