Recurrent circuits as multi-path ensembles for modeling responses of early visual cortical neurons

10/02/2021
by Yimeng Zhang, et al.

In this paper, we showed that adding within-layer recurrent connections to feed-forward neural network models improves the performance of neural response prediction in early visual areas by up to 11 percent, consistently across different data sets and over tens of thousands of model configurations. To understand why recurrent models perform better, we propose that recurrent computation can be conceptualized as an ensemble of multiple feed-forward pathways of different lengths with shared parameters. By reformulating a recurrent model as an equivalent multi-path model and analyzing it through this ensemble, we found that the recurrent model outperforms the corresponding feed-forward one because its compact, implicit multi-path ensemble efficiently approximates the complex function underlying recurrent biological circuits. In addition, performance differences among recurrent models were highly correlated with differences in their multi-path ensembles in terms of path lengths and path diversity; a balance of paths of different lengths in the ensemble was necessary for a model to achieve the best performance. Our study sheds light on the computational rationales and advantages of recurrent circuits for neural modeling and for machine learning tasks in general.
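The multi-path view can be made concrete in the linear case, where unrolling a within-layer recurrent connection for T steps decomposes exactly into a sum of feed-forward paths of lengths 0 through T that share the same recurrent weights. This is a minimal illustrative sketch, not the paper's actual models (which are nonlinear CNNs, where the decomposition is only conceptual); the names `W_ff`, `W_rec`, and the update rule below are assumptions chosen for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
W_ff = rng.normal(size=(n, n)) * 0.5   # feed-forward weights (hypothetical)
W_rec = rng.normal(size=(n, n)) * 0.3  # within-layer recurrent weights (hypothetical)
x = rng.normal(size=n)                 # input to the layer

T = 3
# Recurrent computation, unrolled for T steps:
#   h_0 = W_ff @ x,  h_t = W_ff @ x + W_rec @ h_{t-1}
h = W_ff @ x
for _ in range(T):
    h = W_ff @ x + W_rec @ h

# Equivalent multi-path ensemble: one path per recurrence depth k,
# i.e. h_T = sum over k of W_rec^k @ W_ff @ x for k = 0..T.
paths = [np.linalg.matrix_power(W_rec, k) @ W_ff @ x for k in range(T + 1)]
ensemble = np.sum(paths, axis=0)

print(np.allclose(h, ensemble))  # True: recurrence equals the path ensemble
```

In this linear setting the correspondence is exact, which makes the "ensemble of shared-parameter paths of different lengths" interpretation easy to verify; with nonlinearities the paths interact, but the same structural picture applies.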


research
10/20/2020

Implicit recurrent networks: A novel approach to stationary input processing with recurrent neural networks in deep learning

The brain cortex, which processes visual, auditory and sensory data in t...
research
01/25/2019

A Neurally-Inspired Hierarchical Prediction Network for Spatiotemporal Sequence Learning and Prediction

In this paper we developed a hierarchical network model, called Hierarch...
research
08/12/2023

Revisiting Vision Transformer from the View of Path Ensemble

Vision Transformers (ViTs) are normally regarded as a stack of transform...
research
04/21/2017

Feed-forward approximations to dynamic recurrent network architectures

Recurrent neural network architectures can have useful computational pro...
research
10/13/2020

Unfolding recurrence by Green's functions for optimized reservoir computing

Cortical networks are strongly recurrent, and neurons have intrinsic tem...
research
10/16/2019

Adaptive and Iteratively Improving Recurrent Lateral Connections

The current leading computer vision models are typically feed forward ne...
research
06/07/2017

Recurrent computations for visual pattern completion

Making inferences from partial information constitutes a critical aspect...
