A Supervised Modified Hebbian Learning Method On Feed-forward Neural Networks

12/11/2019
by   Rafi Qumsieh, et al.

In this paper, we present a new supervised learning algorithm based on the Hebbian learning rule, offered as a substitute for backpropagation with gradient descent and intended as a more biologically plausible method. The algorithm achieved its best performance on a feed-forward neural network trained on the MNIST handwritten digits data set, reaching an accuracy of 70.4%.
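The abstract does not spell out the exact modification, but the general idea of a supervised Hebbian rule can be sketched: instead of propagating an error signal backward, connections between active inputs and the desired (target) outputs are strengthened directly. The sketch below is a minimal illustration of that idea in NumPy on a toy problem, not the paper's actual algorithm; the function names, learning rate, and toy data are all assumptions.

```python
import numpy as np

# Hedged sketch of a supervised Hebbian-style update for a single-layer
# feed-forward network. This is NOT the paper's exact rule: it illustrates
# the general principle of reinforcing input-to-target correlations
# (dW = lr * outer(target, x)) without backpropagating an error.

def supervised_hebbian_step(W, x, target, lr=0.1):
    """One supervised Hebbian update.

    W      : (n_out, n_in) weight matrix
    x      : (n_in,) input activations
    target : (n_out,) desired output (e.g. a one-hot label vector)
    """
    return W + lr * np.outer(target, x)

def predict(W, x):
    # The predicted class is the output unit with the largest activation.
    return int(np.argmax(W @ x))

# Toy demo on two linearly separable patterns (not MNIST).
rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(2, 4))
patterns = [(np.array([1.0, 1.0, 0.0, 0.0]), 0),
            (np.array([0.0, 0.0, 1.0, 1.0]), 1)]

for _ in range(20):
    for x, label in patterns:
        t = np.eye(2)[label]          # one-hot target
        W = supervised_hebbian_step(W, x, t)

print([predict(W, x) for x, _ in patterns])  # → [0, 1]
```

In practice a pure Hebbian rule grows weights without bound, so variants usually add a decay or normalization term; the paper's "modified" rule presumably addresses this, but the details are not given in the abstract.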


Related research

04/24/2017
A Neural Network model with Bidirectional Whitening
We present here a new model and algorithm which performs an efficient Na...

07/01/2015
Natural Neural Networks
We introduce Natural Neural Networks, a novel family of algorithms that ...

04/17/2018
Joint Quantizer Optimization based on Neural Quantizer for Sum-Product Decoder
A low-precision analog-to-digital converter (ADC) is required to impleme...

10/30/2017
A Connection between Feed-Forward Neural Networks and Probabilistic Graphical Models
Two of the most popular modelling paradigms in computer vision are feed-...

09/12/2012
Training a Feed-forward Neural Network with Artificial Bee Colony Based Backpropagation Method
Back-propagation algorithm is one of the most widely used and popular te...

04/01/2019
Sound source ranging using a feed-forward neural network with fitting-based early stopping
When a feed-forward neural network (FNN) is trained for source ranging i...

09/17/2022
Introspective Learning: A Two-Stage Approach for Inference in Neural Networks
In this paper, we advocate for two stages in a neural network's decision...
