
CCN GAC Workshop: Issues with learning in biological recurrent neural networks

by Luke Y. Prince, et al.

This perspective piece came about through the Generative Adversarial Collaboration (GAC) series of workshops organized by the Computational Cognitive Neuroscience (CCN) conference in 2020. We brought together experts from theoretical neuroscience to debate emerging issues in our understanding of how learning is implemented in biological recurrent neural networks. Here, we briefly review common assumptions about biological learning and the corresponding findings from experimental neuroscience, and contrast them with the efficiency of gradient-based learning in the recurrent neural networks commonly used in artificial intelligence. We then outline the key issues discussed in the workshop: synaptic plasticity, neural circuits, the theory-experiment divide, and objective functions. Finally, we conclude with recommendations for both theoretical and experimental neuroscientists designing new studies that could help bring clarity to these issues.
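As a point of reference for the contrast drawn above, gradient-based learning in artificial recurrent networks is typically implemented with backpropagation through time (BPTT): the network is unrolled over the sequence and errors are propagated backwards through every time step. The sketch below is purely illustrative (it is not from the paper; the toy task, network size, and hyperparameters are all assumptions) and trains a small vanilla RNN to output the running sum of its inputs.

```python
import numpy as np

# Illustrative only: a vanilla RNN trained by backpropagation through time
# (BPTT) on a toy task (predicting the cumulative sum of the input sequence).
rng = np.random.default_rng(0)
H = 8                                      # hidden units (arbitrary choice)
Wx = rng.normal(scale=0.1, size=(H, 1))    # input -> hidden weights
Wh = rng.normal(scale=0.1, size=(H, H))    # hidden -> hidden (recurrent) weights
Wy = rng.normal(scale=0.1, size=(1, H))    # hidden -> output weights

def forward(xs):
    """Run the RNN over a sequence, caching hidden states for BPTT."""
    h = np.zeros((H, 1))
    hs, ys = [h], []
    for x in xs:
        h = np.tanh(Wx * x + Wh @ h)
        hs.append(h)
        ys.append(float(Wy @ h))
    return hs, ys

def bptt(xs, targets):
    """Return squared-error loss and gradients by unrolling backwards in time."""
    hs, ys = forward(xs)
    dWx, dWh, dWy = np.zeros_like(Wx), np.zeros_like(Wh), np.zeros_like(Wy)
    dh_next = np.zeros((H, 1))
    loss = 0.0
    for t in reversed(range(len(xs))):
        err = ys[t] - targets[t]
        loss += 0.5 * err ** 2
        dWy += err * hs[t + 1].T
        dh = Wy.T * err + dh_next          # gradient arriving at h_t
        dz = (1 - hs[t + 1] ** 2) * dh     # through the tanh nonlinearity
        dWx += dz * xs[t]
        dWh += dz @ hs[t].T
        dh_next = Wh.T @ dz                # carry gradient one step back in time
    return loss, dWx, dWh, dWy

# Plain gradient descent on short random sequences.
lr = 0.05
losses = []
for step in range(2000):
    xs = rng.uniform(-0.5, 0.5, size=5)
    targets = np.cumsum(xs)
    loss, dWx, dWh, dWy = bptt(xs, targets)
    Wx -= lr * dWx
    Wh -= lr * dWh
    Wy -= lr * dWy
    losses.append(loss)
```

Note the step `dh_next = Wh.T @ dz`: credit for an error at time t is assigned to synapses active at earlier times by propagating gradients backwards through the recurrent weights, a non-local operation that is one of the main reasons the biological plausibility of this kind of learning is debated.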


