A biologically plausible neural network for Slow Feature Analysis

10/23/2020
by David Lipshutz et al.

Learning latent features from time series data is an important problem in both machine learning and brain function. One approach, called Slow Feature Analysis (SFA), leverages the slowness of many salient features relative to the rapidly varying input signals. Furthermore, when trained on naturalistic stimuli, SFA reproduces interesting properties of cells in the primary visual cortex and hippocampus, suggesting that the brain uses temporal slowness as a computational principle for learning latent features. However, despite the potential relevance of SFA for modeling brain function, there is currently no SFA algorithm with a biologically plausible neural network implementation, by which we mean an algorithm that operates in the online setting and can be mapped onto a neural network with local synaptic updates. In this work, starting from an SFA objective, we derive an SFA algorithm, called Bio-SFA, with a biologically plausible neural network implementation. We validate Bio-SFA on naturalistic stimuli.
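To illustrate the slowness principle the abstract describes, below is a minimal sketch of classical linear SFA in NumPy: whiten the input, then find the projections whose temporal derivative has the smallest variance. Note this is the standard offline formulation, not the paper's Bio-SFA algorithm, which is online with local synaptic updates; the function name and shapes here are illustrative assumptions.

```python
import numpy as np

def linear_sfa(X, n_features=2):
    """Sketch of classical (offline, linear) Slow Feature Analysis.

    X: array of shape (T, d), a multivariate time series.
    Returns the n_features slowest unit-variance linear projections.
    """
    # Center the data and whiten it so all directions have unit variance.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(Xc)
    eigval, eigvec = np.linalg.eigh(cov)
    W_white = eigvec @ np.diag(1.0 / np.sqrt(eigval)) @ eigvec.T
    Z = Xc @ W_white
    # Approximate the temporal derivative with first differences.
    dZ = np.diff(Z, axis=0)
    dcov = dZ.T @ dZ / len(dZ)
    # Slow features: directions where the derivative variance is smallest,
    # i.e. eigenvectors of dcov with the smallest eigenvalues.
    dval, dvec = np.linalg.eigh(dcov)
    return Z @ dvec[:, :n_features]
```

In the standard SFA demo, a slow sinusoid linearly mixed with a fast one is recovered as the slowest feature, up to sign and scale.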


