Stacked unsupervised learning with a network architecture found by supervised meta-learning

06/06/2022
by Kyle Luther, et al.

Stacked unsupervised learning (SUL) seems more biologically plausible than backpropagation, because learning is local to each layer. But SUL has fallen far short of backpropagation in practical applications, undermining the idea that SUL can explain how brains learn. Here we show an SUL algorithm that can perform completely unsupervised clustering of MNIST digits with accuracy comparable to that of unsupervised algorithms based on backpropagation. Our algorithm is exceeded only by self-supervised methods that require augmenting the training data with geometric distortions. The only prior knowledge in our unsupervised algorithm is implicit in the network architecture. Multiple convolutional "energy layers" contain a sum-of-squares nonlinearity, inspired by "energy models" of primary visual cortex. Convolutional kernels are learned with a fast minibatch implementation of the K-Subspaces algorithm. High accuracy requires preprocessing with an initial whitening layer, representations that are less sparse during inference than during learning, and rescaling for gain control. The hyperparameters of the network architecture are found by supervised meta-learning, which optimizes unsupervised clustering accuracy. We regard such dependence of unsupervised learning on prior knowledge implicit in the network architecture as biologically plausible, and analogous to the dependence of brain architecture on evolutionary history.
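The abstract names the two layer-level ingredients concretely enough to sketch them: an "energy layer" that pools squared projections onto learned subspaces, and a minibatch K-Subspaces rule that learns those subspaces without backpropagation. Below is a minimal NumPy sketch under stated assumptions; it is not the paper's implementation, and the winner-take-all assignment, the blend-and-reorthonormalize update, and all sizes (number of subspaces, subspace dimension, learning rate) are illustrative choices.

```python
import numpy as np

def energy_responses(X, subspaces):
    """Sum-of-squares ("energy") response of each learned subspace.

    X         : (n, d) whitened inputs (e.g. flattened image patches)
    subspaces : (k, r, d) stack of k orthonormal bases, r vectors each
    returns   : (n, k) energies, E[i, j] = ||U_j x_i||^2

    At inference the full vector of k energies is kept (a dense code);
    the hard argmax below is used only during learning.
    """
    proj = np.einsum('krd,nd->nkr', subspaces, X)
    return np.sum(proj ** 2, axis=2)

def ksubspaces_minibatch_step(X, subspaces, lr=0.1):
    """One minibatch update of a K-Subspaces-style learning rule.

    Each sample is assigned to the subspace capturing the most energy;
    each winning basis is nudged toward the principal subspace of its
    assigned samples and re-orthonormalized (a simple surrogate for the
    exact batch K-Subspaces update).
    """
    k, r, d = subspaces.shape
    assignments = np.argmax(energy_responses(X, subspaces), axis=1)
    updated = subspaces.copy()
    for j in range(k):
        members = X[assignments == j]
        if len(members) < r:
            continue  # too few samples to define an r-dim subspace
        _, _, Vt = np.linalg.svd(members, full_matrices=False)
        blended = (1 - lr) * subspaces[j] + lr * Vt[:r]
        q, _ = np.linalg.qr(blended.T)       # re-orthonormalize
        updated[j] = q.T
    return updated, assignments

# Toy usage: 64 subspaces of dimension 4 over 9x9 (81-dim) patches.
rng = np.random.default_rng(0)
patches = rng.standard_normal((256, 81))
bases = np.stack([np.linalg.qr(rng.standard_normal((81, 4)))[0].T
                  for _ in range(64)])
bases, assign = ksubspaces_minibatch_step(patches, bases)
codes = energy_responses(patches, bases)     # dense energies for inference
```

The whitening preprocessing, the rescaling for gain control, and the meta-learned architectural hyperparameters mentioned in the abstract are outside the scope of this sketch.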
