Classification using Hyperdimensional Computing: A Review

04/19/2020
by   Lulu Ge, et al.

Hyperdimensional (HD) computing is built upon a unique data type, the hypervector, whose dimension is typically in the tens of thousands. Originally proposed for cognitive tasks, HD computing operates by measuring similarity among its data. Data transformation is realized by three operations: addition, multiplication, and permutation. The ultra-wide data representation introduces redundancy that protects against noise; because information is distributed evenly across every bit of a hypervector, HD computing is inherently robust. Additionally, owing to the nature of these three operations, HD computing offers fast learning, high energy efficiency, and acceptable accuracy in learning and classification tasks. This paper introduces the background of HD computing and reviews its data representation, data transformation, and similarity measurement. The near-orthogonality of random vectors in high dimensions presents opportunities for flexible computing. To balance the trade-off between accuracy and efficiency, strategies include, but are not limited to, encoding, retraining, binarization, and hardware acceleration. Evaluations indicate that HD computing shows great potential for problems whose data take the form of letters, signals, and images. HD computing shows particular promise as a lightweight classifier replacing machine learning algorithms in the field of the Internet of Things (IoT).
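The three operations and the similarity measure described in the abstract can be sketched in plain Python. This is a minimal illustration with bipolar ({-1, +1}) hypervectors; the function names and the choice of cosine similarity are assumptions for the sketch, not taken from the paper:

```python
import random

random.seed(0)
D = 10_000  # typical hypervector dimensionality

def random_hv():
    """Bipolar hypervector: each component drawn uniformly from {-1, +1}."""
    return [random.choice((-1, 1)) for _ in range(D)]

def bind(a, b):
    """Multiplication (binding): element-wise product.
    The result is quasi-orthogonal to both operands."""
    return [x * y for x, y in zip(a, b)]

def bundle(a, b):
    """Addition (bundling): element-wise sum followed by the sign function.
    The result remains similar to each operand."""
    return [1 if x + y > 0 else -1 if x + y < 0 else 0
            for x, y in zip(a, b)]

def permute(a, k=1):
    """Permutation: cyclic shift by k positions, often used to
    encode sequence order."""
    return a[-k:] + a[:-k]

def cosine(a, b):
    """Cosine similarity; random hypervectors in high dimensions
    are nearly orthogonal (similarity close to 0)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

x, y = random_hv(), random_hv()
print(f"unrelated      : {cosine(x, y):+.2f}")           # near 0
print(f"x vs bundle(x,y): {cosine(x, bundle(x, y)):+.2f}")  # clearly positive
print(f"x vs bind(x,y)  : {cosine(x, bind(x, y)):+.2f}")    # near 0 again
```

At D = 10,000 the standard deviation of the cosine between two random hypervectors is about 1/sqrt(D) = 0.01, which is why "near 0" is a safe expectation and why the redundancy claimed in the abstract makes the representation robust to bit-level noise.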

