Efficient Similarity-Preserving Unsupervised Learning using Modular Sparse Distributed Codes and Novelty-Contingent Noise

10/19/2020
by   Rod Rinkus, et al.

There is increasing realization in neuroscience that information is represented in the brain, e.g., in neocortex and hippocampus, in the form of sparse distributed codes (SDCs), a kind of cell assembly. Two essential questions are: a) how are such codes formed on the basis of single trials, and b) how is similarity preserved during learning, i.e., how do more similar inputs get mapped to more similar SDCs? I describe a novel Modular Sparse Distributed Code (MSDC) that provides simple, neurally plausible answers to both questions. An MSDC coding field (CF) consists of Q winner-take-all (WTA) competitive modules (CMs), each comprising K binary units (analogs of principal cells). The modular nature of the CF makes possible a single-trial, unsupervised learning algorithm that approximately preserves similarity and, crucially, runs in fixed time: the number of steps needed to store an item remains constant as the number of stored items grows. Further, once items are stored as MSDCs in superposition, with their intersection structure reflecting input similarity, both fixed-time best-match retrieval and fixed-time belief update (updating the probabilities of all stored items) become possible. The algorithm's core principle is simply to add noise, proportional to the novelty of the input, into the process of choosing a code, i.e., choosing a winner in each CM. This causes the expected intersection of the code for an input, X, with the code of each previously stored input, Y, to be proportional to the similarity of X and Y. Results demonstrating these capabilities for spatial patterns are given in the appendix.
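The abstract's core mechanism can be illustrated with a minimal Python sketch. This is not the paper's implementation: the familiarity measure `G`, the softmax temperature schedule (sharp when the input is familiar, flat and noisy when it is novel), and all parameter values are assumptions chosen only to make the idea concrete. Only the structural terms Q, K, CM, and CF come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

Q, K = 10, 8   # Q WTA competitive modules (CMs), each with K binary units
N = 20         # input (feature) dimension

# Binary weights from each of the N input features to every unit in every CM.
W = np.zeros((Q, K, N))

def familiarity(x):
    """G in [0, 1]: each CM's max input summation, normalized by the input's
    active-feature count and averaged over CMs. Novelty is 1 - G."""
    u = W @ x                         # (Q, K) input summations
    return np.mean(u.max(axis=1)) / max(x.sum(), 1.0)

def store(x):
    """Single-trial learning: pick one winner per CM, with choice noise
    proportional to the input's novelty, then make a permanent binary
    (Hebbian-style) weight update onto the winners."""
    u = W @ x                         # (Q, K)
    G = familiarity(x)                # 1.0 when x exactly matches a stored input
    beta = 1.0 + 10.0 * G             # assumed schedule: novel -> flat/noisy draw
    code = np.zeros(Q, dtype=int)
    for q in range(Q):
        p = np.exp(beta * (u[q] - u[q].max()))   # softmax over the CM's K units
        p /= p.sum()
        code[q] = rng.choice(K, p=p)
        W[q, code[q]] = np.maximum(W[q, code[q]], x)  # binary, permanent
    return code

# Example: store a pattern, then check that it is now fully familiar.
x = np.zeros(N)
x[:6] = 1.0
code = store(x)    # one winner index per CM; this Q-tuple is the MSDC for x
```

Because noise is high for novel inputs, a dissimilar input tends to draw fresh winners (a nearly disjoint code), while a similar input reuses most of the previous winners, so expected code intersection tracks input similarity.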


research
01/26/2017

A Radically New Theory of how the Brain Represents and Computes with Probabilities

The brain is believed to implement probabilistic reasoning and to repres...
research
10/21/2017

Superposed Episodic and Semantic Memory via Sparse Distributed Representation

The abilities to perceive, learn, and use generalities, similarities, cl...
research
11/12/2016

Sparsey: Event Recognition via Deep Hierarchical Sparse Distributed Codes

Visual cortex's hierarchical, multi-level organization is captured in ma...
research
07/15/2017

Quantum Computation via Sparse Distributed Representation

Quantum superposition says that any physical system simultaneously exist...
research
09/04/2017

Neural Distributed Autoassociative Memories: A Survey

Introduction. Neural network models of autoassociative, distributed memo...
research
09/18/2019

Generation mechanism of cell assembly to store information about hand recognition

A specific memory is stored in a cell assembly that is activated during ...
