Interneurons accelerate learning dynamics in recurrent neural networks for statistical adaptation

09/21/2022
by David Lipshutz, et al.

Early sensory systems in the brain rapidly adapt to fluctuating input statistics, which requires recurrent communication between neurons. Mechanistically, such recurrent communication is often indirect and mediated by local interneurons. In this work, we explore the computational benefits of mediating recurrent communication via interneurons compared with direct recurrent connections. To this end, we consider two mathematically tractable recurrent neural networks that statistically whiten their inputs: one with direct recurrent connections and the other with interneurons that mediate recurrent communication. By analyzing the corresponding continuous-time synaptic dynamics and numerically simulating the networks, we show that the network with interneurons is more robust to initialization than the network with direct recurrent connections, in the sense that the convergence time for the synaptic dynamics in the network with interneurons (resp. direct recurrent connections) scales logarithmically (resp. linearly) with the spectrum of their initialization. Our results suggest that interneurons are computationally useful for rapid adaptation to changing input statistics. Interestingly, the network with interneurons is an overparameterized solution of the whitening objective for the network with direct recurrent connections, so our results can be viewed as a recurrent neural network analogue of the implicit acceleration phenomenon observed in overparameterized feedforward linear networks.
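To make the headline claim concrete, below is a minimal, hypothetical scalar caricature of the two kinds of synaptic dynamics; it is not the paper's model (whose weights are matrix-valued), and the symbols c, m, v, and tol are illustrative choices rather than the paper's notation. A direct recurrent weight m relaxes toward the whitening fixed point m* = sqrt(c), where the output y = x/m has unit variance, while the interneuron-mediated weight is overparameterized as m = v^2. Integrating both from the same effective initialization exhibits the linear-versus-logarithmic convergence-time scaling described in the abstract.

```python
import math

# Toy scalar caricature (an assumption for illustration; the paper's
# networks have matrix-valued synaptic weights).
# Direct recurrent weight m:        dm/dt = c/m**2 - 1
#   (equilibrium output y = x/m, so E[y^2] = c/m**2; the fixed point
#    m* = sqrt(c) gives unit output variance, i.e. a whitened output)
# Interneuron-mediated weight m = v**2 (overparameterized):
#   dv/dt = v * (c/v**4 - 1), sharing the whitening fixed point m* = sqrt(c).
c = 1.0          # input variance; the whitened fixed point is then 1
dt = 1e-3        # Euler step size
tol = 1e-2       # convergence tolerance around the fixed point

def converge_time(deriv, w0):
    """Euler-integrate dw/dt = deriv(w) until w is within tol of 1."""
    w, t = w0, 0.0
    while abs(w - 1.0) >= tol:
        w += dt * deriv(w)
        t += dt
    return t

for m0 in (10.0, 100.0, 1000.0):
    t_direct = converge_time(lambda m: c / m**2 - 1.0, m0)
    t_inter = converge_time(lambda v: v * (c / v**4 - 1.0), math.sqrt(m0))
    print(f"m0 = {m0:6.0f}   direct: {t_direct:8.2f}   interneuron: {t_inter:5.2f}")
```

With c = 1, the printed convergence time for the direct dynamics grows roughly tenfold per decade of m0 (linear in the initialization), while the factored, interneuron-style dynamics grows only by an additive constant per decade (logarithmic), mirroring the scaling contrast the abstract describes.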


Related research

10/20/2020
Implicit recurrent networks: A novel approach to stationary input processing with recurrent neural networks in deep learning
The brain cortex, which processes visual, auditory and sensory data in t...

09/25/2017
Robust Associative Memories Naturally Occuring From Recurrent Hebbian Networks Under Noise
The brain is a noisy system subject to energy constraints. These facts a...

06/06/2016
Feedforward Initialization for Fast Inference of Deep Generative Networks is biologically plausible
We consider deep multi-layered generative models such as Boltzmann machi...

11/14/2020
Using noise to probe recurrent neural network structure and prune synapses
Many networks in the brain are sparsely connected, and the brain elimina...

10/06/2022
A Step Towards Uncovering The Structure of Multistable Neural Networks
We study the structure of multistable recurrent neural networks. The act...

08/25/2023
Adaptive whitening with fast gain modulation and slow synaptic plasticity
Neurons in early sensory areas rapidly adapt to changing sensory statist...

01/27/2023
Statistical whitening of neural populations with gain-modulating interneurons
Statistical whitening transformations play a fundamental role in many co...
