KCNet: An Insect-Inspired Single-Hidden-Layer Neural Network with Randomized Binary Weights for Prediction and Classification Tasks

08/17/2021
by Jinyung Hong et al.

Fruit flies are an established model system for studying olfactory learning, as they readily learn to associate odors with either electric shock or sugar rewards. The mechanisms of the insect brain apparently responsible for odor learning form a relatively shallow neuronal architecture. Olfactory inputs are received by the antennal lobe (AL) of the brain, which produces an encoding of each odor mixture across roughly 50 sub-units known as glomeruli. Each of these glomeruli then projects its component of this feature vector to several of the roughly 2,000 so-called Kenyon Cells (KCs) in a region of the brain known as the mushroom body (MB). Fly responses to odors are generated by small downstream neuropils that decode the higher-order representation from the MB. Research has shown that there is no recognizable pattern in the glomeruli–KC connections (and thus in the particular higher-order representations); they are akin to fingerprints: even isogenic flies have different projections. Leveraging insights from this architecture, we propose KCNet, a single-hidden-layer neural network that contains sparse, randomized, binary weights between the input layer and the hidden layer and analytically learned weights between the hidden layer and the output layer. Furthermore, we propose a dynamic optimization algorithm that enables KCNet to increase performance beyond its structural limits by searching for a more efficient set of inputs. For odorant-perception tasks that predict perceptual properties of an odorant, we show that KCNet outperforms existing data-driven approaches such as XGBoost. For image-classification tasks, KCNet achieves reasonable performance on benchmark datasets (MNIST, Fashion-MNIST, and EMNIST) without any data-augmentation methods or convolutional layers, and it runs particularly fast. Thus, neural networks inspired by the insect brain can be both economical and effective.
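To make the described architecture concrete, below is a minimal sketch (not the authors' implementation) of a KCNet-style model in NumPy: each hidden unit receives a sparse, random, binary subset of the inputs, and the hidden-to-output weights are fit analytically with a ridge-regularized least-squares solve. The class and parameter names (KCNetSketch, n_hidden, fan_in, reg) and the choice of a ReLU-style hidden activation are illustrative assumptions.

```python
import numpy as np

class KCNetSketch:
    """Illustrative KCNet-style model: sparse, random, binary input weights
    and analytically solved output weights (ridge-regularized least squares).
    Hyperparameters and the hidden activation are assumptions, not the
    authors' exact implementation."""

    def __init__(self, n_inputs, n_hidden=2000, fan_in=6, reg=1e-3, seed=0):
        rng = np.random.default_rng(seed)
        # Each hidden unit ("Kenyon cell") samples a small random subset of
        # inputs ("glomeruli") with binary (0/1) connection weights.
        self.W_in = np.zeros((n_inputs, n_hidden))
        for j in range(n_hidden):
            idx = rng.choice(n_inputs, size=fan_in, replace=False)
            self.W_in[idx, j] = 1.0
        self.reg = reg
        self.W_out = None

    def _hidden(self, X):
        # Nonlinearity applied to the sparse random projection.
        return np.maximum(X @ self.W_in, 0.0)

    def fit(self, X, Y):
        H = self._hidden(X)
        # Closed-form ridge solution: W_out = (H^T H + reg * I)^{-1} H^T Y
        A = H.T @ H + self.reg * np.eye(H.shape[1])
        self.W_out = np.linalg.solve(A, H.T @ Y)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.W_out
```

For regression (e.g., predicting perceptual properties of an odorant), Y is the matrix of target values; for classification, Y would typically be one-hot labels with predictions taken as the argmax over the outputs. The dynamic input-optimization procedure described in the abstract is not reproduced here.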
