The Role Of Biology In Deep Learning

09/07/2022
by Robert Bain, et al.

Artificial neural networks took much of their inspiration from their biological counterparts on the way to becoming our best machine perceptual systems. This work summarizes some of that history and incorporates modern theoretical neuroscience into experiments with artificial neural networks from the field of deep learning. Specifically, iterative magnitude pruning is used to train sparsely connected networks with 33x fewer weights without loss in performance. These networks are used to test, and ultimately reject, the hypothesis that weight sparsity alone improves image noise robustness. Recent work mitigated catastrophic forgetting using weight sparsity, activation sparsity, and active dendrite modeling. This paper replicates those findings and extends the method to train convolutional neural networks on a more challenging continual learning task. The code has been made publicly available.
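
The central technique named in the abstract, iterative magnitude pruning, alternates training with removal of the smallest-magnitude weights. Below is a minimal sketch in PyTorch, assuming a model and a caller-supplied training routine; the hyperparameters (`rounds`, `prune_frac`) and the function name are illustrative, not taken from the paper, and the paper's exact schedule (e.g. whether weights are rewound between rounds, as in lottery-ticket-style IMP) may differ.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

def iterative_magnitude_prune(model, train_fn, rounds=10, prune_frac=0.3):
    """Sketch of iterative magnitude pruning (IMP).

    Each round trains the current network, then zeroes the smallest
    `prune_frac` of the still-unpruned weights in each layer. After
    `rounds` rounds the remaining density is (1 - prune_frac) ** rounds,
    e.g. 0.7 ** 10 ~= 0.03, on the order of the ~33x weight reduction
    described in the abstract (hyperparameters here are illustrative).
    """
    prunable = [(m, "weight") for m in model.modules()
                if isinstance(m, (nn.Linear, nn.Conv2d))]
    for _ in range(rounds):
        train_fn(model)                    # train / fine-tune the current sparse network
        for module, name in prunable:      # prune smallest-magnitude remaining weights per layer
            prune.l1_unstructured(module, name=name, amount=prune_frac)
    for module, name in prunable:          # bake the accumulated masks into the weights
        prune.remove(module, name)
    return model
```

The per-layer L1 (magnitude) criterion and the multiplicative pruning schedule are one common IMP variant; global (cross-layer) magnitude ranking is another, and the paper may use either.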

