Continual Prototype Evolution: Learning Online from Non-Stationary Data Streams

09/02/2020
by Matthias De Lange, et al.

As learning from non-stationary streams of data has proven a challenging endeavour, current continual learners often strongly relax the problem: they assume balanced datasets, unlimited processing of data-stream subsets, and additional task information, sometimes even at inference time. In contrast, our continual learner processes the data stream in an online fashion, without additional task information, and shows solid robustness to imbalanced data streams resembling a real-world setting. We address these challenging settings by combining aggregated prototypes with nearest-neighbour classification in a shared latent space, where our Continual Prototype Evolution (CoPE) approach enables learning and prediction at any point in time. As the embedding network continually changes, prototypes inevitably become obsolete, which we prevent by replaying exemplars from memory. We obtain state-of-the-art performance by a significant margin on five benchmarks, including two highly imbalanced data streams.
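
For intuition, the sketch below illustrates the general recipe the abstract describes: an embedding network mapping inputs to a shared latent space, class prototypes used for nearest-prototype classification at any point in the stream, prototypes refreshed as the encoder changes, and a small exemplar memory for replay. This is a minimal, hedged illustration only; the `Encoder`, the exponential-moving-average prototype update, the reservoir buffer, and all hyperparameters are assumptions for illustration, not the exact CoPE formulation from the paper.

```python
# Illustrative sketch of prototype-based online continual learning with
# nearest-prototype classification and exemplar replay (not the exact CoPE
# algorithm; names and hyperparameters are assumptions).
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """Toy embedding network mapping inputs to a shared latent space."""
    def __init__(self, in_dim=32, latent_dim=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, latent_dim))

    def forward(self, x):
        return F.normalize(self.net(x), dim=1)  # unit-norm embeddings


class PrototypeLearner:
    def __init__(self, encoder, momentum=0.9, buffer_size=200):
        self.encoder = encoder
        self.momentum = momentum          # EMA factor for prototype updates
        self.prototypes = {}              # class id -> latent prototype
        self.buffer, self.buffer_size = [], buffer_size
        self.seen = 0                     # stream counter for reservoir sampling
        self.opt = torch.optim.SGD(encoder.parameters(), lr=0.01)

    def _replay(self, k):
        """Sample up to k stored exemplars to mix with the incoming batch."""
        if not self.buffer:
            return None, None
        sample = random.sample(self.buffer, min(k, len(self.buffer)))
        xs, ys = zip(*sample)
        return torch.stack(xs), torch.tensor(ys)

    def observe(self, x, y):
        """Process one incoming batch of the (possibly imbalanced) stream."""
        rx, ry = self._replay(len(x))
        if rx is not None:                # joint batch: new data + replayed exemplars
            x, y = torch.cat([x, rx]), torch.cat([y, ry])

        z = self.encoder(x)
        # Pull embeddings toward their class prototype (a simple stand-in loss).
        loss = 0.0
        for c in y.unique().tolist():
            zc = z[y == c]
            proto = self.prototypes.get(c, zc.mean(0).detach())
            loss = loss + (1 - F.cosine_similarity(zc, proto.unsqueeze(0))).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()

        # Refresh prototypes with an exponential moving average so they track
        # the continually changing embedding network.
        with torch.no_grad():
            z = self.encoder(x)
            for c in y.unique().tolist():
                new = z[y == c].mean(0)
                old = self.prototypes.get(c, new)
                self.prototypes[c] = F.normalize(
                    self.momentum * old + (1 - self.momentum) * new, dim=0)

        # Reservoir sampling keeps a bounded exemplar memory.
        for xi, yi in zip(x, y):
            self.seen += 1
            if len(self.buffer) < self.buffer_size:
                self.buffer.append((xi.detach(), int(yi)))
            else:
                j = random.randrange(self.seen)
                if j < self.buffer_size:
                    self.buffer[j] = (xi.detach(), int(yi))

    @torch.no_grad()
    def predict(self, x):
        """Nearest-prototype classification, usable at any point in the stream."""
        z = self.encoder(x)
        classes = list(self.prototypes)
        protos = torch.stack([self.prototypes[c] for c in classes])
        sims = z @ protos.t()
        return torch.tensor([classes[i] for i in sims.argmax(1)])
```

Because prediction only requires the current prototypes and the encoder, the learner can classify at any time without task labels, which mirrors the online, task-free setting described above.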

