Gluing Neural Networks Symbolically Through Hyperdimensional Computing

by Peter Sutor, et al.

Hyperdimensional Computing affords simple yet powerful operations for creating long Hyperdimensional Vectors (hypervectors) that can efficiently encode information, be used for learning, and are dynamic enough to be modified on the fly. In this paper, we explore the notion of using binary hypervectors to directly encode the final, classifying output signals of neural networks in order to fuse different networks together at the symbolic level. This allows multiple neural networks to work together to solve a problem, with little additional overhead. Output signals just before classification are encoded as hypervectors and bundled together through consensus summation to train a classification hypervector. This process can be performed iteratively, and even on a single neural network, by instead forming a consensus of multiple classification hypervectors. We find that this outperforms the state of the art, or is on a par with it, while incurring very little overhead, as hypervector operations are extremely fast and efficient in comparison to the neural networks. This consensus process can learn online and even grow or lose models in real time. Hypervectors act as memories that can be stored, and even further bundled together over time, affording lifelong learning capabilities. Additionally, this consensus structure inherits the benefits of Hyperdimensional Computing without sacrificing the performance of modern Machine Learning. The technique can be extrapolated to virtually any neural model and requires little modification to employ: one need only record the output signals of networks when presented with a testing example.
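The pipeline the abstract describes can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes a hypothetical random bipolar projection as the encoder of a network's real-valued pre-classification signal, per-dimension majority vote as the consensus summation, and Hamming distance for classification. All names, dimensions, and the toy data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
D, F = 10_000, 128   # hypervector dimensionality, output-signal size (assumed)

# Hypothetical encoder: a fixed random bipolar projection maps a network's
# real-valued pre-classification signal to a binary hypervector.
projection = rng.choice([-1, 1], size=(D, F))

def encode(signal):
    # Sign-threshold the projected signal into a binary {0, 1} hypervector.
    return (projection @ signal >= 0).astype(np.int8)

def bundle(hvs):
    # Consensus summation: per-dimension majority vote over the bundle.
    return (2 * np.sum(hvs, axis=0) >= len(hvs)).astype(np.int8)

def classify(query_hv, class_hvs):
    # Nearest classification hypervector by Hamming distance.
    return min(class_hvs,
               key=lambda c: np.count_nonzero(query_hv != class_hvs[c]))

# Toy data: each class's "network outputs" are a prototype plus small noise.
protos = {c: rng.normal(size=F) for c in (0, 1)}
train = {c: [protos[c] + 0.1 * rng.normal(size=F) for _ in range(20)]
         for c in protos}

# One classification hypervector per class, bundled from encoded signals
# (in the paper's setting these could come from several different networks).
class_hvs = {c: bundle([encode(s) for s in sigs]) for c, sigs in train.items()}

pred = classify(encode(protos[0] + 0.1 * rng.normal(size=F)), class_hvs)
```

Because bundling is just summation and thresholding, new models or new examples can be folded into a class hypervector incrementally, which is what enables the online and lifelong-learning behavior the abstract claims.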

