Factorizers for Distributed Sparse Block Codes

03/24/2023
by Michael Hersche, et al.

Distributed sparse block codes (SBCs) exhibit compact representations for encoding and manipulating symbolic data structures using fixed-width vectors. One major challenge, however, is to disentangle, or factorize, such data structures into their constituent elements without having to search through all possible combinations. This factorization becomes more challenging when queried by noisy SBCs, wherein symbol representations are relaxed due to perceptual uncertainty and to approximations made when modern neural networks are used to generate the query vectors. To address these challenges, we first propose a fast and highly accurate method for factorizing a more flexible, and hence generalized, form of SBCs, dubbed GSBCs. Our iterative factorizer introduces a threshold-based nonlinear activation, conditional random sampling, and an ℓ_∞-based similarity metric. Its random sampling mechanism, in combination with the search in superposition, allows us to analytically determine the expected number of decoding iterations, which matches the empirical observations up to the GSBC's bundling capacity. Second, the proposed factorizer maintains its high accuracy when queried by noisy product vectors generated using deep convolutional neural networks (CNNs). This facilitates its application in replacing the large fully connected layer (FCL) in CNNs, whereby C trainable class vectors, or attribute combinations, can be implicitly represented by our factorizer having F codebooks (one per factor), each with C^(1/F) fixed codevectors. We provide a methodology to flexibly integrate our factorizer into the classification layer of CNNs with a novel loss function. We demonstrate the feasibility of our method on four deep CNN architectures over the CIFAR-100, ImageNet-1K, and RAVEN datasets. In all use cases, the number of parameters and operations is significantly reduced compared to the FCL.
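
The abstract names the factorizer's three ingredients but does not spell out the loop. As a rough illustration only, here is a minimal resonator-style sketch in NumPy for the simpler setting of dense bipolar vectors with elementwise binding; it combines a threshold-based activation, conditional random sampling when no codevector clears the threshold, and a max-similarity (ℓ_∞-flavored) convergence check. The function and parameter names (factorize, thresh, n_iters) are our own, and this is a sketch under these simplifying assumptions, not the paper's GSBC algorithm.

```python
import numpy as np

def factorize(query, codebooks, n_iters=100, thresh=0.1, rng=None):
    """Iteratively factorize `query`, assumed to be the elementwise
    product of one bipolar (+/-1) codevector per codebook.
    codebooks: list of F arrays, each of shape (m_f, d)."""
    rng = np.random.default_rng() if rng is None else rng
    d = query.shape[0]
    # Start each factor estimate as the superposition of its whole codebook.
    est = [np.sign(cb.sum(axis=0) + 1e-9) for cb in codebooks]
    for _ in range(n_iters):
        for f, cb in enumerate(codebooks):
            # Unbind the current estimates of all other factors.
            others = np.ones(d)
            for g, e in enumerate(est):
                if g != f:
                    others *= e
            unbound = query * others
            sims = cb @ unbound / d                     # similarity per codevector
            act = np.where(sims >= thresh, sims, 0.0)   # threshold activation
            if act.sum() == 0.0:
                # Conditional random sampling: nothing activated,
                # so draw a random codevector as the next estimate.
                est[f] = cb[rng.integers(cb.shape[0])]
            else:
                # Weighted superposition of the activated codevectors.
                est[f] = np.sign(act @ cb + 1e-9)
        # Converged once every estimate coincides with a single codevector.
        if all(np.max(cb @ e) == d for cb, e in zip(codebooks, est)):
            break
    return [int(np.argmax(cb @ e)) for cb, e in zip(codebooks, est)]

# Toy usage: two codebooks of 10 codevectors each (100 combinations).
rng = np.random.default_rng(0)
d = 1024
books = [rng.choice([-1.0, 1.0], size=(10, d)) for _ in range(2)]
truth = [3, 7]
q = books[0][truth[0]] * books[1][truth[1]]
print(factorize(q, books, rng=rng))  # -> [3, 7]
```

The parameter saving claimed in the last sentence follows directly from the codebook structure: with, say, C = 10,000 classes and F = 2 codebooks of √10,000 = 100 codevectors each, the classification layer stores 2·100 = 200 d-dimensional vectors instead of the FCL's 10,000, a 50× reduction (illustrative numbers, not results from the paper).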
