Do What Nature Did To Us: Evolving Plastic Recurrent Neural Networks For Task Generalization

09/08/2021
by Fan Wang, et al.

While artificial neural networks (ANNs) have been widely adopted in machine learning, researchers are increasingly attentive to the gaps between ANNs and biological neural networks (BNNs). In this paper, we propose a framework named Evolutionary Plastic Recurrent Neural Networks (EPRNN). Inspired by BNNs, EPRNN combines evolution strategies, plasticity rules, and recursion-based learning in a single meta-learning framework for generalization across tasks. More specifically, EPRNN uses nested loops for meta-learning: an outer loop searches for the optimal initial parameters of the neural network and its learning rules, while an inner loop adapts them to specific tasks. In the inner loop, EPRNN attains both long-term and short-term memory by combining plasticity with recursion-based learning mechanisms, both of which are believed to be responsible for memristance in BNNs. This inner-loop setting closely mirrors that of BNNs: it neither queries a gradient oracle for optimization nor requires the exact form of the learning objective. To evaluate EPRNN, we carry out extensive experiments on two groups of tasks: sequence prediction and wheeled-robot navigation. The results demonstrate a clear advantage of EPRNN over state-of-the-art plasticity- and recursion-based methods, while yielding performance comparable to deep-learning-based approaches. These results suggest that EPRNN can generalize to a variety of tasks, and they encourage further work on plasticity- and recursion-based learning mechanisms.
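To make the nested-loop idea concrete, below is a minimal, self-contained NumPy sketch. It is not the paper's implementation: the Hebbian trace update, the trace clipping, the readout of the first few hidden units as the prediction, the toy repeating-pattern task, and all names (unpack, inner_loop, outer_loop) and hyperparameters are illustrative assumptions. The outer loop is an OpenAI-style evolution-strategies estimator over both the initial weights and the plasticity parameters; the inner loop adapts without gradients, relying only on the recurrent state (short-term memory) and a plastic trace (longer-term memory).

```python
import numpy as np

rng = np.random.default_rng(0)

IN, HID = 4, 16            # input and hidden sizes (illustrative choices)
PRE = IN + HID             # presynaptic vector = input concatenated with state

def unpack(flat):
    """Split a flat genome into base weights W, per-synapse plasticity
    coefficients alpha, and a scalar Hebbian learning rate eta."""
    n = HID * PRE
    W = flat[:n].reshape(HID, PRE)
    alpha = flat[n:2 * n].reshape(HID, PRE)
    eta = flat[2 * n]
    return W, alpha, eta

def make_task():
    """Toy sequence-prediction task: a short random pattern, repeated so
    that within-episode plasticity has structure to exploit."""
    pattern = rng.normal(0.0, 1.0, (5, IN))
    return np.tile(pattern, (4, 1))

def inner_loop(flat, seq):
    """Gradient-free adaptation to one task. The recurrent state h acts as
    short-term memory; the Hebbian trace acts as longer-term memory."""
    W, alpha, eta = unpack(flat)
    h = np.zeros(HID)
    trace = np.zeros_like(W)
    err = 0.0
    for t in range(len(seq) - 1):
        pre = np.concatenate([seq[t], h])
        h = np.tanh((W + alpha * trace) @ pre)        # plastic effective weights
        trace = np.clip(trace + eta * np.outer(h, pre), -1.0, 1.0)  # assumed Hebbian rule
        err += np.sum((h[:IN] - seq[t + 1]) ** 2)     # first IN units read out as prediction
    return -err                                        # fitness = negative prediction error

def outer_loop(generations=50, pop=64, sigma=0.05, lr=0.02, n_tasks=8):
    """OpenAI-style evolution strategies over both the initial weights and
    the plasticity-rule parameters (alpha, eta)."""
    dim = 2 * HID * PRE + 1
    theta = rng.normal(0.0, 0.1, dim)
    for _ in range(generations):
        tasks = [make_task() for _ in range(n_tasks)]  # resample tasks each generation
        eps = rng.normal(0.0, 1.0, (pop, dim))
        fit = np.array([np.mean([inner_loop(theta + sigma * e, s) for s in tasks])
                        for e in eps])
        ranks = (fit - fit.mean()) / (fit.std() + 1e-8)
        theta += lr / (pop * sigma) * (eps.T @ ranks)  # ES gradient estimate
    return theta

theta = outer_loop(generations=10)  # short run for illustration
```

Note that the inner loop touches no gradients and never sees an explicit objective; fitness is evaluated only after the episode, which matches the abstract's description of the BNN-like inner-loop setting.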


