
Diffusion-based neuromodulation can eliminate catastrophic forgetting in simple neural networks

by Roby Velez et al.
University of Wyoming

A long-term goal of AI is to produce agents that can learn a diversity of skills throughout their lifetimes and continuously improve those skills via experience. A longstanding obstacle to that goal is catastrophic forgetting, in which learning new information erases previously learned information. Catastrophic forgetting occurs in artificial neural networks (ANNs), which have fueled most recent advances in AI. A recent paper proposed that catastrophic forgetting in ANNs can be reduced by promoting modularity, which can limit forgetting by isolating task information to specific clusters of nodes and connections (functional modules). While that prior work did show that modular ANNs suffered less from catastrophic forgetting, it was not able to produce ANNs that possessed task-specific functional modules, leaving the main theory regarding modularity and forgetting untested. We introduce diffusion-based neuromodulation, which simulates the release of diffusing neuromodulatory chemicals within an ANN that can modulate (i.e., up- or down-regulate) learning in a spatial region. On the simple diagnostic problem from the prior work, diffusion-based neuromodulation 1) induces task-specific learning in groups of nodes and connections (task-specific localized learning), which 2) produces functional modules for each subtask, and 3) yields higher performance by eliminating catastrophic forgetting. Overall, our results suggest that diffusion-based neuromodulation promotes task-specific localized learning and functional modularity, which can help solve the challenging but important problem of catastrophic forgetting.
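The core mechanism can be illustrated with a minimal sketch (this is not the authors' implementation; the Gaussian falloff, the node coordinates, and parameters such as `sigma` are illustrative assumptions). Each node is given a spatial position, a neuromodulatory chemical is released at a point source, and each weight's effective learning rate is scaled by the chemical's concentration at that node, so updates are confined to the module near the source:

```python
import numpy as np

def modulation(positions, source, strength=1.0, sigma=0.5):
    """Concentration of a diffusing neuromodulatory chemical,
    modeled here as a Gaussian falloff from a point source.

    positions: (N, 2) array of node coordinates
    source:    (2,) coordinates of the release site
    Returns a per-node learning-rate multiplier in [0, strength].
    """
    d2 = np.sum((positions - source) ** 2, axis=1)
    return strength * np.exp(-d2 / (2 * sigma ** 2))

# Two spatial clusters of hidden nodes, one per hypothetical subtask.
rng = np.random.default_rng(0)
positions = np.vstack([
    rng.normal(loc=[-1.0, 0.0], scale=0.1, size=(4, 2)),  # module A
    rng.normal(loc=[+1.0, 0.0], scale=0.1, size=(4, 2)),  # module B
])
weights = rng.normal(size=8)
grads = np.ones(8)      # stand-in for a gradient from the current subtask
base_lr = 0.1

# Releasing the chemical near module A up-regulates learning there
# and leaves module B's weights (and hence its stored skill) intact.
m = modulation(positions, source=np.array([-1.0, 0.0]))
weights_new = weights - base_lr * m * grads
```

Because the modulation decays with distance, weights in the far module receive a near-zero update, which is the sense in which localized learning protects previously learned subtasks from being overwritten.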

