Unsupervised Learning by Competing Hidden Units

06/26/2018
by Dmitry Krotov, et al.

It is widely believed that the backpropagation algorithm is essential for learning good feature detectors in the early layers of artificial neural networks, so that these detectors are useful for the task performed by the higher layers of the network. At the same time, the traditional form of backpropagation is biologically implausible. In the present paper we propose an unusual learning rule, which has a degree of biological plausibility and which is motivated by Hebb's idea that a change in synapse strength should be local, i.e., should depend only on the activities of the pre- and postsynaptic neurons. We design a learning algorithm that utilizes global inhibition in the hidden layer and is capable of learning early feature detectors in a completely unsupervised way. These learned lower-layer feature detectors can then be used to train the higher-layer weights in the usual supervised way, so that the performance of the full network is comparable to that of standard feedforward networks trained end-to-end with the backpropagation algorithm.
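As a loose illustration of the idea described above, a local, competition-based Hebbian update could be sketched as follows. This is a simplified assumption-laden sketch, not the authors' exact rule: the unit with the largest input current is strengthened toward the input (Hebbian), one runner-up is weakly weakened (anti-Hebbian, standing in for global inhibition), and the touched weight vectors are renormalized to keep the dynamics bounded. The learning rate, the ranking scheme, and the normalization step are all choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_hidden = 64, 16
W = rng.normal(scale=0.1, size=(n_hidden, n_inputs))

def local_update(W, x, lr=0.02, delta=0.4):
    """One competitive, purely local learning step (illustrative sketch).

    Every quantity used to change a synapse W[i, j] is available at
    that synapse: the presynaptic input x, the unit's own current,
    and the unit's rank in the layer-wide competition. No error
    signal is backpropagated from any later layer.
    """
    currents = W @ x                       # pre -> post drive, no labels used
    order = np.argsort(currents)
    winner, runner_up = order[-1], order[-2]
    # Hebbian push of the winner toward the input pattern.
    W[winner] += lr * (x - currents[winner] * W[winner])
    # Anti-Hebbian push on a runner-up, mimicking global inhibition.
    W[runner_up] -= lr * delta * (x - currents[runner_up] * W[runner_up])
    # Renormalize the touched rows so weights stay bounded.
    for i in (winner, runner_up):
        W[i] /= max(np.linalg.norm(W[i]), 1e-12)
    return W

for _ in range(2000):
    x = rng.normal(size=n_inputs)
    x /= np.linalg.norm(x)                 # unit-norm input pattern
    W = local_update(W, x)
```

After training, the rows of W play the role of the unsupervised feature detectors; in the scheme the abstract describes, a separate top layer would then be fit to them with ordinary supervised learning.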

Related research

02/27/2019  Biologically plausible deep learning -- but how far can we go with shallow networks?
09/26/2022  Activation Learning by Local Competitions
05/08/2019  Unsupervised Learning through Temporal Smoothing and Entropy Maximization
11/17/2017  Deep supervised learning using local errors
06/06/2022  Stacked unsupervised learning with a network architecture found by supervised meta-learning
06/13/2019  Associated Learning: Decomposing End-to-end Backpropagation based on Auto-encoders and Target Propagation
10/16/2020  Towards truly local gradients with CLAPP: Contrastive, Local And Predictive Plasticity
