Hebbian Semi-Supervised Learning in a Sample Efficiency Setting

03/16/2021
by   Gabriele Lagani, et al.

We propose to address the issue of sample efficiency in Deep Convolutional Neural Networks (DCNNs) with a semi-supervised training strategy that combines Hebbian learning with gradient descent: all internal layers (both convolutional and fully connected) are pre-trained using an unsupervised approach based on Hebbian learning, while the last fully connected layer (the classification layer) is trained using Stochastic Gradient Descent (SGD). Since Hebbian learning is an unsupervised learning method, its potential lies in the possibility of training the internal layers of a DCNN without labeled examples; only the final fully connected layer has to be trained with labeled examples. We performed experiments on various object recognition datasets, in different regimes of sample efficiency, comparing our semi-supervised approach (Hebbian for internal layers + SGD for the final fully connected layer) with end-to-end supervised backpropagation training. The results show that, in regimes where the number of available labeled samples is low, our semi-supervised approach outperforms full backpropagation in almost all cases.
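The two-stage strategy described above can be sketched on toy data: an internal layer is first trained with a plain Hebbian rule on all samples without using any labels (weight normalization keeps the weights bounded), and only the final classification layer is then trained with SGD on a small labeled subset. This is a minimal illustrative sketch, not the paper's implementation; the synthetic data, layer sizes, and learning rates are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 200 samples with a dominant latent direction d;
# the label depends on the latent variable z, but only 20 samples are labeled.
d = np.zeros(20)
d[:2] = 1.0 / np.sqrt(2.0)
z = 2.0 * rng.normal(size=(200, 1))
X = z * d + 0.2 * rng.normal(size=(200, 20))
y = (z[:, 0] > 0).astype(int)
labeled = np.arange(20)          # the small labeled subset

# --- Stage 1: unsupervised Hebbian pre-training of an internal layer ---
n_hidden = 8
W = 0.1 * rng.normal(size=(20, n_hidden))
eta = 0.01
for epoch in range(10):
    for x in X:                              # all samples, no labels needed
        h = x @ W                            # linear activations
        W += eta * np.outer(x, h)            # Hebbian: strengthen co-active pairs
        W /= np.linalg.norm(W, axis=0)       # normalize columns to stay bounded

# --- Stage 2: supervised SGD training of the final classification layer ---
H = X @ W                                    # Hebbian features for all samples
V = np.zeros((n_hidden, 2))                  # softmax classifier weights
lr = 0.1
for epoch in range(200):
    for i in labeled:                        # only labeled samples used here
        logits = H[i] @ V
        p = np.exp(logits - logits.max())
        p /= p.sum()
        grad = np.outer(H[i], p - np.eye(2)[y[i]])   # softmax cross-entropy grad
        V -= lr * grad

acc = ((H[labeled] @ V).argmax(axis=1) == y[labeled]).mean()
```

With weight normalization, each Hebbian unit converges toward the dominant direction of the input statistics, so the classifier only needs the few labeled samples to fit a decision threshold on top of the pre-trained features.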

