Biologically Plausible Online Principal Component Analysis Without Recurrent Neural Dynamics

10/16/2018
by Victor Minden et al.

Artificial neural networks that learn to perform Principal Component Analysis (PCA) and related tasks using strictly local learning rules have been previously derived based on the principle of similarity matching: similar pairs of inputs should map to similar pairs of outputs. However, the operation of these networks (and of similar networks) requires a fixed-point iteration to determine the output corresponding to a given input, which means that dynamics must operate on a faster time scale than the variation of the input. Further, during these fast dynamics such networks typically "disable" learning, updating synaptic weights only once the fixed-point iteration has been resolved. Here, we derive a network for PCA-based dimensionality reduction that avoids this fast fixed-point iteration. The key novelty of our approach is a modification of the similarity matching objective to encourage near-diagonality of a synaptic weight matrix. We then approximately invert this matrix using a Taylor series approximation, replacing the previous fast iterations. In the offline setting, our algorithm corresponds to a dynamical system, the stability of which we rigorously analyze. In the online setting (i.e., with stochastic gradients), we map our algorithm to a familiar neural network architecture and give numerical results showing that our method converges at a competitive rate. The computational complexity per iteration of our online algorithm is linear in the total degrees of freedom, which is in some sense optimal.
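The core idea described above — replacing the fast fixed-point computation of the output with a first-order Taylor (Neumann) approximation of the lateral weight matrix's inverse around its diagonal — can be illustrated with a minimal sketch. The code below is a hypothetical NumPy implementation of an online similarity-matching PCA network in this style; the data model, variable names, initialization, and learning-rate schedule are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k, T = 10, 3, 5000  # input dim, output dim, number of samples

# Synthetic data with a dominant k-dimensional principal subspace
# (hypothetical test setup, not from the paper)
C = np.diag([5.0, 4.0, 3.0] + [0.1] * (n - k))
X = rng.multivariate_normal(np.zeros(n), C, size=T)

W = 0.1 * rng.standard_normal((k, n))  # feedforward synaptic weights
M = np.eye(k)                          # lateral weights, kept near-diagonal

for t, x in enumerate(X):
    eta = 1.0 / (100 + t)  # decaying learning rate (illustrative schedule)

    # Split M into diagonal and off-diagonal parts: M = D + O.
    # First-order Taylor/Neumann approximation of the inverse:
    #   M^{-1} ~= D^{-1} - D^{-1} O D^{-1},
    # accurate when M is near-diagonal, replacing a fast fixed-point loop.
    d = np.diag(M)
    D_inv = np.diag(1.0 / d)
    O = M - np.diag(d)
    M_inv_approx = D_inv - D_inv @ O @ D_inv

    # Network output for this input, computed in a single feedforward pass
    y = M_inv_approx @ (W @ x)

    # Local Hebbian / anti-Hebbian updates from similarity matching
    W += eta * (np.outer(y, x) - W)
    M += eta * (np.outer(y, y) - M)

# Effective filters: rows of F should span the top-k principal subspace
F = np.linalg.inv(M) @ W
```

Note that each iteration costs O(kn + k^2) operations — linear in the number of synaptic weights — consistent with the per-iteration complexity claim in the abstract.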

Related research

11/30/2015 - A Normative Theory of Adaptive Dimensionality Reduction in Neural Networks
To make sense of the world our brains must analyze high-dimensional data...

03/02/2015 - A Hebbian/Anti-Hebbian Neural Network for Linear Subspace Learning: A Derivation from Multidimensional Scaling of Streaming Data
Neural network models of early sensory processing typically reduce the d...

06/20/2021 - Distributed Picard Iteration: Application to Distributed EM and Distributed PCA
In recent work, we proposed a distributed Picard iteration (DPI) that al...

11/30/2015 - Optimization theory of Hebbian/anti-Hebbian networks for PCA and whitening
In analyzing information streamed by sensory organs, our brains face cha...

06/06/2016 - Feedforward Initialization for Fast Inference of Deep Generative Networks is biologically plausible
We consider deep multi-layered generative models such as Boltzmann machi...

07/10/2017 - Accelerated Stochastic Power Iteration
Principal component analysis (PCA) is one of the most powerful tools in ...

10/22/2021 - Multiplication-Avoiding Variant of Power Iteration with Applications
Power iteration is a fundamental algorithm in data analysis. It extracts...
