Efficient Continual Learning in Neural Networks with Embedding Regularization

09/09/2019
by   Jary Pomponi, et al.

Continual learning of deep neural networks is a key requirement for scaling them up to more complex applicative scenarios and for achieving real lifelong learning of these architectures. Previous approaches to the problem have considered either the progressive increase in the size of the networks, or have tried to regularize the network behavior to equalize it with respect to previously observed tasks. In the latter case, it is essential to understand what type of information best represents this past behavior. Common techniques include regularizing the past outputs, gradients, or individual weights. In this work, we propose a new, relatively simple and efficient method to perform continual learning by regularizing instead the network internal embeddings. To make the approach scalable, we also propose a dynamic sampling strategy to reduce the memory footprint of the required external storage. We show that our method performs favorably with respect to state-of-the-art approaches in the literature, while requiring significantly less space in memory and computational time. In addition, inspired by recent works, we evaluate the impact of selecting a more flexible model for the activation functions inside the network, and assess the effect of catastrophic forgetting on the activation functions themselves.
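To make the idea concrete, below is a minimal sketch of embedding regularization with a dynamically sampled buffer. It is not the paper's exact algorithm: the assumption that the model returns a (logits, embedding) pair, the reservoir-style sampling, the buffer capacity, the batch size of 32 replayed items, and the penalty weight lambda_reg are all illustrative placeholders.

```python
import random
import torch
import torch.nn.functional as F

class EmbeddingBuffer:
    """Stores (input, embedding) pairs from past data, using reservoir-style
    dynamic sampling to keep the external memory footprint bounded."""
    def __init__(self, capacity=500):
        self.capacity = capacity
        self.items = []
        self.seen = 0

    def maybe_add(self, x, emb):
        # Each incoming example replaces a stored one with decreasing probability.
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append((x.detach().cpu(), emb.detach().cpu()))
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = (x.detach().cpu(), emb.detach().cpu())

    def sample(self, k):
        return random.sample(self.items, min(k, len(self.items)))

def training_step(model, optimizer, x, y, buffer, lambda_reg=1.0, device="cpu"):
    """One training step: task loss plus an embedding-regularization penalty.
    Assumes model(x) returns (logits, embedding)."""
    model.train()
    logits, emb = model(x.to(device))
    loss = F.cross_entropy(logits, y.to(device))

    # Regularize internal embeddings: the current embeddings of stored inputs
    # should stay close to the embeddings recorded when they were first seen.
    replay = buffer.sample(32)
    if replay:
        old_x = torch.stack([b[0] for b in replay]).to(device)
        old_emb = torch.stack([b[1] for b in replay]).to(device)
        _, new_emb = model(old_x)
        loss = loss + lambda_reg * F.mse_loss(new_emb, old_emb)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Dynamically sample some of the current batch into the buffer.
    for xi, ei in zip(x, emb):
        buffer.maybe_add(xi, ei)
    return loss.item()
```

The mean-squared penalty on embeddings is one natural choice of distance; the key design point is that past behavior is summarized by stored internal representations rather than by stored outputs, gradients, or per-weight importance measures.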

research 07/05/2020
Pseudo-Rehearsal for Continual Learning with Normalizing Flows
Catastrophic forgetting (CF) happens whenever a neural network overwrite...

research 02/11/2022
Continual Learning with Invertible Generative Models
Catastrophic forgetting (CF) happens whenever a neural network overwrite...

research 05/06/2021
Structured Ensembles: an Approach to Reduce the Memory Footprint of Ensemble Methods
In this paper, we propose a novel ensembling technique for deep neural n...

research 08/03/2022
Centroids Matching: an efficient Continual Learning approach operating in the embedding space
Catastrophic forgetting (CF) occurs when a neural network loses the info...

research 02/26/2020
Metaplasticity in Multistate Memristor Synaptic Networks
Recent studies have shown that metaplastic synapses can retain informati...

research 05/28/2021
More Is Better: An Analysis of Instance Quantity/Quality Trade-off in Rehearsal-based Continual Learning
The design of machines and algorithms capable of learning in a dynamical...

research 04/19/2019
Continual Learning with Self-Organizing Maps
Despite remarkable successes achieved by modern neural networks in a wid...
