A Hebbian/Anti-Hebbian Network for Online Sparse Dictionary Learning Derived from Symmetric Matrix Factorization
Olshausen and Field (OF) proposed that neural computations in the primary visual cortex (V1) can be partially modeled by sparse dictionary learning. By minimizing the regularized representation error, they derived an online algorithm that learns Gabor-filter receptive fields from a natural image ensemble, in agreement with physiological experiments. Whereas the OF algorithm can be mapped onto the dynamics and synaptic plasticity of a single-layer neural network, the derived learning rule is non-local (the synaptic weight update depends on the activity of neurons other than just the pre- and postsynaptic ones) and hence biologically implausible. Here, to overcome this problem, we derive sparse dictionary learning from a novel cost function: a regularized error of the symmetric factorization of the input's similarity matrix. Our algorithm maps onto a neural network with the same architecture as OF's but uses only biologically plausible local learning rules. When trained on natural images, our network learns Gabor-filter receptive fields and reproduces the correlation among synaptic weights hardwired into the OF network. Therefore, online symmetric matrix factorization may serve as an algorithmic theory of neural computation.
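To make the architecture concrete, below is a minimal NumPy sketch of a single-layer network of the kind the abstract describes: feedforward weights updated by a Hebbian rule, lateral weights by an anti-Hebbian rule, and recurrent dynamics that settle to a sparse code for each streamed input. All hyperparameters (dimensions, learning rate `eta`, sparsity threshold `lam`) and the exact update forms are illustrative assumptions, not the paper's derivation; it shows only the local-learning structure, where each weight update depends solely on its pre- and postsynaptic activities.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 16, 8   # toy input/output sizes (assumed, not from the paper)
lam = 0.1             # sparsity threshold (assumed hyperparameter)
eta = 0.05            # learning rate (assumed hyperparameter)

W = 0.1 * rng.standard_normal((n_out, n_in))  # feedforward (Hebbian) weights
M = np.zeros((n_out, n_out))                  # lateral (anti-Hebbian) weights

def neural_dynamics(x, W, M, n_steps=100, dt=0.1):
    """Run recurrent dynamics to a sparse steady state for one input x."""
    u = np.zeros(n_out)
    r = np.zeros(n_out)
    for _ in range(n_steps):
        # leaky integration: feedforward drive minus lateral inhibition
        u += dt * (-u + W @ x - M @ r)
        # rectified soft threshold yields non-negative sparse activity
        r = np.maximum(u - lam, 0.0)
    return r

for _ in range(500):
    x = rng.standard_normal(n_in)   # streaming input (random toy data)
    x /= np.linalg.norm(x)
    r = neural_dynamics(x, W, M)
    # Hebbian update: uses only presynaptic (x) and postsynaptic (r) activity
    W += eta * (np.outer(r, x) - (r**2)[:, None] * W)
    # anti-Hebbian update for lateral weights, likewise purely local
    M += eta * (np.outer(r, r) - (r**2)[:, None] * M)
    np.fill_diagonal(M, 0.0)        # no self-inhibition
```

Note the contrast with the OF learning rule: here each synapse's update involves only the two neurons it connects, which is what makes the rule "local" in the abstract's sense.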