Product Kanerva Machines: Factorized Bayesian Memory

by Adam Marblestone et al.

An ideal cognitively-inspired memory system would compress and organize incoming items. The Kanerva Machine (Wu et al., 2018) is a Bayesian model that naturally implements online memory compression. However, the organization of the Kanerva Machine is limited by its use of a single Gaussian random matrix for storage. Here we introduce the Product Kanerva Machine, which dynamically combines many smaller Kanerva Machines. Its hierarchical structure provides a principled way to abstract invariant features and gives scaling and capacity advantages over single Kanerva Machines. We show that it can exhibit unsupervised clustering, find sparse and combinatorial allocation patterns, and discover spatial tunings that approximately factorize simple images by object.
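The abstract's core idea of dynamically combining many smaller memories can be illustrated with a minimal sketch. The code below is an illustration only, not the authors' model: it routes a query across K small linear memories with a softmax gate (a stand-in for the paper's learned dynamic combination) and addresses each memory with a least-squares read, loosely analogous to the linear-Gaussian reads of a Kanerva Machine. All names (`memories`, `router_W`, `read`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K small memories, each a (slots x code_dim) matrix,
# standing in for K small Kanerva Machines.
K, slots, code_dim = 4, 8, 16
memories = [rng.normal(size=(slots, code_dim)) for _ in range(K)]

def read(z, router_W):
    """Read query z by softly routing it across the K small memories."""
    logits = router_W @ z                  # (K,) routing scores per machine
    gates = np.exp(logits - logits.max())  # numerically stable softmax
    gates /= gates.sum()
    out = np.zeros(code_dim)
    for g, M in zip(gates, memories):
        # Addressing weights via least squares: find w with M.T @ w ~ z,
        # a crude analogue of a linear-Gaussian memory read.
        w, *_ = np.linalg.lstsq(M.T, z, rcond=None)
        out += g * (M.T @ w)               # gate-weighted reconstruction
    return out, gates

router_W = rng.normal(size=(K, code_dim))
z = rng.normal(size=code_dim)
recon, gates = read(z, router_W)
```

In the actual model the combination weights and memory posteriors are learned Bayesian quantities; this sketch only conveys the "route, read, recombine" structure that gives the product construction its factorized organization.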

