A comparison of Vector Symbolic Architectures

by Kenny Schlegel et al.

Vector Symbolic Architectures (VSAs) combine a high-dimensional vector space with a set of carefully designed operators to perform symbolic computations with large numerical vectors. Major goals are to exploit the representational power of such vectors and their ability to deal with fuzziness and ambiguity. Over the past years, VSAs have been applied to a broad range of tasks, and several VSA implementations have been proposed. The available implementations differ in the underlying vector space (e.g., binary vectors or complex-valued vectors) and in the particular implementations of the required VSA operators, with important ramifications for the properties of these architectures. For example, not every VSA is equally well suited to every task, and some are entirely incompatible with certain tasks. In this paper, we give an overview of eight available VSA implementations and discuss their commonalities and differences in the underlying vector space and in the bundling and binding/unbinding operations. We create a taxonomy of available binding/unbinding operations and show an important ramification of non-self-inverse binding operations using an example from analogical reasoning. A main contribution is an experimental comparison of the available implementations regarding (1) the capacity of bundles, (2) the approximation quality of non-exact unbinding operations, and (3) the influence of combined binding and bundling operations on query-answering performance. We expect this systematization and comparison to be relevant for the development and evaluation of new VSAs and, most importantly, to support the selection of an appropriate VSA for a particular task.
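To make the bundling, binding, and unbinding operations discussed above concrete, here is a minimal sketch of one of the simpler VSA flavors: a MAP-style architecture over random bipolar vectors, where binding is elementwise multiplication (and therefore self-inverse) and bundling is an elementwise majority vote. This is an illustrative assumption for exposition, not the paper's experimental code; the dimensionality, the record contents, and the helper names are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # dimensionality; high enough that random vectors are quasi-orthogonal

def random_vec():
    # Random bipolar (+1/-1) atomic vector, as used in MAP-style VSAs
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    # Elementwise multiplication; self-inverse, so bind(bind(a, b), b) == a
    return a * b

def bundle(*vs):
    # Elementwise majority (sign of the sum); ties broken toward +1
    s = np.sum(vs, axis=0)
    return np.where(s >= 0, 1, -1)

def sim(a, b):
    # Cosine similarity of bipolar vectors (1.0 = identical, ~0 = unrelated)
    return float(np.dot(a, b)) / D

# Encode the record {colour: red, shape: circle} as a bundle of role-filler bindings
colour, shape = random_vec(), random_vec()
red, circle = random_vec(), random_vec()
record = bundle(bind(colour, red), bind(shape, circle))

# Query "what is the colour?": unbinding is just binding again (self-inverse),
# yielding a noisy version of `red` that is cleaned up against a codebook
noisy = bind(record, colour)
codebook = {"red": red, "circle": circle}
answer = max(codebook, key=lambda name: sim(noisy, codebook[name]))
```

The retrieved vector is only approximately equal to the stored filler (the other bound pair acts as noise), which is why query answering in a VSA ends with a nearest-neighbor cleanup step; the experiments summarized above measure how this degrades as bundle size grows and when the unbinding operation is itself only approximate.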




