Continual Learning with Echo State Networks

05/17/2021
by Andrea Cossu, et al.

Continual Learning (CL) refers to a learning setup where data is non-stationary and the model has to learn without forgetting existing knowledge. The study of CL for sequential patterns has so far revolved around trained recurrent networks. In this work, instead, we introduce CL in the context of Echo State Networks (ESNs), where the recurrent component is kept fixed. We provide the first evaluation of catastrophic forgetting in ESNs and highlight the benefits of using CL strategies that are not applicable to trained recurrent models. Our results confirm the ESN as a promising model for CL and open the way to its use in streaming scenarios.
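To make the setup concrete, below is a minimal NumPy sketch of the idea: a reservoir whose weights are frozen at initialization, a linear readout fitted by ridge regression, and a simple rehearsal-style CL loop over cached reservoir states. All names, hyperparameters, and the `tasks` iterable are illustrative assumptions, not code from the paper; rehearsal over cached features is used here only as one example of a strategy made cheap by the fixed recurrent component.

```python
import numpy as np

class ESN:
    """Echo State Network: fixed random reservoir, trainable linear readout."""

    def __init__(self, n_inputs, n_reservoir, n_outputs,
                 spectral_radius=0.9, seed=0):
        rng = np.random.default_rng(seed)
        # Input and recurrent weights are sampled once and never updated.
        self.W_in = rng.uniform(-0.1, 0.1, size=(n_reservoir, n_inputs))
        W = rng.uniform(-1.0, 1.0, size=(n_reservoir, n_reservoir))
        # Rescale the recurrent matrix to the desired spectral radius
        # (the usual echo state property heuristic).
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        self.W = W
        # The readout is the only trainable (and hence forgettable) component.
        self.W_out = np.zeros((n_outputs, n_reservoir))

    def final_state(self, sequence):
        """Drive the frozen reservoir with a sequence; return the last state."""
        h = np.zeros(self.W.shape[0])
        for u in sequence:
            h = np.tanh(self.W_in @ u + self.W @ h)
        return h

def fit_readout(esn, states, targets, reg=1e-4):
    """Closed-form ridge regression for the readout.
    states: (N, n_reservoir), targets: (N, n_outputs)."""
    gram = states.T @ states + reg * np.eye(states.shape[1])
    esn.W_out = np.linalg.solve(gram, states.T @ targets).T

def train_continually(esn, tasks):
    """Rehearsal-style CL over a stream of tasks (`tasks` is a hypothetical
    iterable of (sequences, targets) pairs). Because the reservoir is frozen,
    states cached for old tasks remain valid and can simply be concatenated
    with the states of the new task before refitting the readout."""
    seen_states, seen_targets = [], []
    for sequences, targets in tasks:
        states = np.stack([esn.final_state(seq) for seq in sequences])
        seen_states.append(states)
        seen_targets.append(targets)
        fit_readout(esn, np.vstack(seen_states), np.vstack(seen_targets))
```

Caching reservoir states rather than raw sequences is what the frozen recurrent component buys: the rehearsal buffer never goes stale, because the feature map that produced it cannot drift as learning proceeds, whereas for a fully trained recurrent network old hidden states would have to be recomputed from the stored inputs after every update.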


