
A Neural Network with Local Learning Rules for Minor Subspace Analysis

02/10/2021
by Yanis Bahroun, et al.

The development of neuromorphic hardware and the modeling of biological neural networks require algorithms with local learning rules. Artificial neural networks using local learning rules to perform principal subspace analysis (PSA) and clustering have recently been derived from principled objective functions. However, no biologically plausible networks exist for minor subspace analysis (MSA), a fundamental signal processing task. MSA extracts the lowest-variance subspace of the input signal covariance matrix. Here, we introduce a novel similarity matching objective for extracting the minor subspace, Minor Subspace Similarity Matching (MSSM). Moreover, we derive an adaptive MSSM algorithm that naturally maps onto a novel neural network with local learning rules, and we give numerical results showing that our method converges at a competitive rate.
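For context on what MSA computes: for centered data X in R^{n x T} with sample covariance C = X X^T / T, the minor subspace of dimension d is the span of the eigenvectors of C with the d smallest eigenvalues, equivalently the orthonormal V in R^{n x d} minimizing tr(V^T C V). The sketch below (Python/NumPy) shows only this standard offline eigendecomposition as a point of reference; it is not the paper's MSSM objective or its online network with local learning rules, and the function name and toy data are illustrative assumptions.

    # Offline baseline for minor subspace analysis (MSA): the d-dimensional
    # subspace of the sample covariance carrying the least variance.
    # This is NOT the MSSM algorithm of the paper, only a reference computation.
    import numpy as np

    def minor_subspace(X, d):
        """Return an orthonormal basis (n x d) of the minor subspace of X.

        X : array of shape (n, T), T centered samples of dimension n.
        d : dimension of the minor subspace to extract.
        """
        n, T = X.shape
        C = X @ X.T / T                       # sample covariance, n x n
        eigvals, eigvecs = np.linalg.eigh(C)  # eigh returns eigenvalues in ascending order
        return eigvecs[:, :d]                 # eigenvectors of the d smallest eigenvalues

    # Usage on toy data: the last two coordinates carry the least variance,
    # so the recovered basis should concentrate on those axes.
    rng = np.random.default_rng(0)
    scales = np.array([5.0, 4.0, 3.0, 0.5, 0.1])
    X = scales[:, None] * rng.standard_normal((5, 10000))
    V = minor_subspace(X, 2)
    print(np.round(V.T @ V, 3))   # ~ identity: the basis is orthonormal
    print(np.round(V, 2))         # columns aligned with the low-variance axes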
