On Training Recurrent Neural Networks for Lifelong Learning

11/16/2018
by Shagun Sodhani, et al.

Capacity saturation and catastrophic forgetting are central challenges for any parametric lifelong learning system. In this work, we study these challenges in the context of sequential supervised learning, with an emphasis on recurrent neural networks. To evaluate models in the lifelong learning setting, we propose a simple, intuitive, curriculum-based benchmark in which models are trained on a task with increasing levels of difficulty. As a step towards developing true lifelong learning systems, we unify Gradient Episodic Memory (an approach for alleviating catastrophic forgetting) and Net2Net (an approach for expanding model capacity). Evaluation on the proposed benchmark shows that the unified model is better suited to the lifelong learning setting than either of its constituent models.
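To make the unification concrete, below is a minimal NumPy sketch of the two ingredients the abstract names: a GEM-style gradient projection that keeps updates from increasing the loss on episodic memories of past tasks, and a Net2Net-style function-preserving widening that adds capacity without changing the network's output. This is an illustrative sketch, not the authors' implementation; the function names `agem_project` and `net2wider` are hypothetical, and the single-constraint projection is a simplification (full GEM solves a quadratic program with one constraint per past task).

```python
import numpy as np

def agem_project(grad, ref_grad):
    """Project the current-task gradient so it does not conflict with a
    gradient computed on a batch drawn from episodic memory.
    Single-constraint simplification of GEM's per-task QP."""
    dot = grad @ ref_grad
    if dot < 0.0:  # gradients conflict: this step would hurt old tasks
        grad = grad - (dot / (ref_grad @ ref_grad)) * ref_grad
    return grad

def net2wider(W_in, b, W_out, new_width, rng=None):
    """Net2Net-style function-preserving widening of one hidden layer.
    W_in: (hidden, n_in) incoming weights, b: (hidden,) bias,
    W_out: (n_out, hidden) outgoing weights."""
    rng = np.random.default_rng(0) if rng is None else rng
    hidden = W_in.shape[0]
    # Each new unit replicates a randomly chosen existing unit.
    mapping = np.concatenate(
        [np.arange(hidden), rng.integers(0, hidden, new_width - hidden)])
    counts = np.bincount(mapping, minlength=hidden)
    # Copy incoming weights; divide outgoing weights by the replication
    # count so replicas sum to the original contribution, preserving
    # the function the network computes.
    return W_in[mapping], b[mapping], W_out[:, mapping] / counts[mapping]
```

One plausible wiring of the two pieces, consistent with the abstract's framing: the projection step runs at every update to protect past tasks, while widening is triggered only when the model saturates on the current difficulty level of the curriculum.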

Related research

08/08/2022 · Towards lifelong learning of Recurrent Neural Networks for control design
This paper proposes a method for lifelong learning of Recurrent Neural Networks ...

02/15/2021 · Does Standard Backpropagation Forget Less Catastrophically Than Adam?
Catastrophic forgetting remains a severe hindrance to the broad application ...

06/29/2023 · The Importance of Robust Features in Mitigating Catastrophic Forgetting
Continual learning (CL) is an approach to address catastrophic forgetting ...

12/21/2013 · An Empirical Investigation of Catastrophic Forgetting in Gradient-Based Neural Networks
Catastrophic forgetting is a problem faced by many machine learning models ...

07/14/2020 · Anatomy of Catastrophic Forgetting: Hidden Representations and Task Semantics
A central challenge in developing versatile machine learning systems is ...

06/20/2023 · On Compositionality and Improved Training of NADO
NeurAlly-Decomposed Oracle (NADO) is a powerful approach for controllable ...

10/31/2016 · Full-Capacity Unitary Recurrent Neural Networks
Recurrent neural networks are powerful models for processing sequential ...
