
Structured Compression and Sharing of Representational Space for Continual Learning

01/23/2020
by Gobinda Saha, et al.
Purdue University

Humans are skilled at learning adaptively and efficiently throughout their lives, but learning tasks incrementally causes artificial neural networks to overwrite relevant information learned about older tasks, resulting in 'Catastrophic Forgetting'. Efforts to overcome this phenomenon suffer from poor utilization of resources in many ways, such as through the need to save older data or parametric importance scores, or to grow the network architecture. We propose an algorithm that enables a network to learn continually and efficiently by partitioning the representational space into a Core space, that contains the condensed information from previously learned tasks, and a Residual space, which is akin to a scratch space for learning the current task. The information in the Residual space is then compressed using Principal Component Analysis and added to the Core space, freeing up parameters for the next task. We evaluate our algorithm on P-MNIST, CIFAR-10 and CIFAR-100 datasets. We achieve comparable accuracy to state-of-the-art methods while overcoming the problem of catastrophic forgetting completely. Additionally, we get up to 4.5x improvement in energy efficiency during inference due to the structured nature of the resulting architecture.
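The Core/Residual split described in the abstract rests on a PCA step over layer activations: after a task is learned, the top principal directions of each layer's activations form the Core space, and the orthogonal remainder is freed up as Residual space for the next task. Below is a minimal sketch of that compression step, not the authors' implementation; the helper name `core_subspace_from_activations`, the 95% variance threshold, and the random activations are illustrative assumptions.

```python
import torch

def core_subspace_from_activations(acts, var_threshold=0.95):
    """Illustrative PCA step (hypothetical helper, not the official code).

    Finds the principal directions of a layer's activations that together
    explain `var_threshold` of the variance. These directions would serve as
    the 'Core' basis; their orthogonal complement is the 'Residual' space
    left free for learning the next task.

    acts: (num_samples, num_features) activations collected on the current task.
    Returns: (num_features, k) orthonormal basis of the core subspace.
    """
    acts = acts - acts.mean(dim=0, keepdim=True)        # center the activations
    # SVD of the centered activation matrix; rows of vt are principal directions
    _, s, vt = torch.linalg.svd(acts, full_matrices=False)
    var = s**2 / (s**2).sum()                           # variance fraction per direction
    k = int((var.cumsum(0) < var_threshold).sum().item()) + 1
    return vt[:k].T                                     # keep the top-k directions

# Example: 1,000 samples of a 512-dimensional hidden layer (synthetic data)
acts = torch.randn(1000, 512)
core_basis = core_subspace_from_activations(acts)
print(core_basis.shape)  # (512, k), with k set by the variance threshold
```

In this sketch, raising the variance threshold enlarges the Core space (better retention of the old task) at the cost of a smaller Residual space for subsequent tasks.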

Related Research

02/20/2018

Continual Reinforcement Learning with Complex Synapses

Unlike humans, who are capable of continual learning over their lifetime...
10/09/2019

Continual Learning Using Bayesian Neural Networks

Continual learning models allow to learn and adapt to new changes and ta...
06/22/2019

Beneficial perturbation network for continual learning

Sequential learning of multiple tasks in artificial neural networks usin...
07/15/2020

SpaceNet: Make Free Space For Continual Learning

The continual learning (CL) paradigm aims to enable neural networks to l...
10/07/2020

A Theoretical Analysis of Catastrophic Forgetting through the NTK Overlap Matrix

Continual learning (CL) is a setting in which an agent has to learn from...
06/15/2021

Bridge Networks

Despite rapid progress, current deep learning methods face a number of c...
02/12/2018

Pseudo-Recursal: Solving the Catastrophic Forgetting Problem in Deep Neural Networks

In general, neural networks are not currently capable of learning tasks ...

Code Repositories

CL_PCA

Official PyTorch implementation of the paper "Structured Compression and Sharing of Representational Space for Continual Learning".
