CLOPS: Continual Learning of Physiological Signals

04/20/2020
by Dani Kiyasseh, et al.

Deep learning algorithms are known to experience destructive interference when instances violate the assumption of being independent and identically distributed (i.i.d.). This violation, however, is ubiquitous in clinical settings, where data are streamed temporally from a multitude of physiological sensors. To overcome this obstacle, we propose CLOPS, a healthcare-specific, replay-based continual learning strategy. In three continual learning scenarios based on three publicly available datasets, we show that CLOPS can outperform its multi-task learning counterpart. Moreover, we propose end-to-end trainable parameters, which we term task-instance parameters, that can be used to quantify task difficulty and similarity. This quantification yields insights into both network interpretability and clinical applications, where task difficulty is poorly quantified.
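The core idea of the abstract, a replay-based strategy whose buffer is guided by per-instance parameters learned end to end, can be sketched briefly. The snippet below is a minimal illustration in PyTorch, not the authors' implementation; the weighting transform, the buffer-selection rule, and names such as `TaskInstanceWeights` and `beta` are assumptions made for clarity.

```python
import torch
import torch.nn as nn

class TaskInstanceWeights(nn.Module):
    """One learnable scalar per training instance (a stand-in for the
    paper's 'task-instance parameters'; the exact formulation differs)."""
    def __init__(self, num_instances):
        super().__init__()
        self.beta = nn.Parameter(torch.ones(num_instances))

    def forward(self, idx):
        # Keep weights positive via softplus; this transform is a design
        # choice of the sketch, not taken from the paper.
        return nn.functional.softplus(self.beta[idx])

def weighted_loss(logits, labels, weights):
    # Per-sample cross-entropy scaled by the learned instance weights.
    per_sample = nn.functional.cross_entropy(logits, labels, reduction="none")
    return (weights * per_sample).mean()

# --- toy usage on synthetic data -------------------------------------------
model = nn.Linear(12, 2)          # stand-in for a physiological-signal encoder
num_instances = 100
ti = TaskInstanceWeights(num_instances)
opt = torch.optim.Adam(list(model.parameters()) + list(ti.parameters()), lr=1e-3)

x = torch.randn(num_instances, 12)
y = torch.randint(0, 2, (num_instances,))

for step in range(5):
    idx = torch.randint(0, num_instances, (16,))   # mini-batch of instance indices
    logits = model(x[idx])
    loss = weighted_loss(logits, y[idx], ti(idx))
    opt.zero_grad()
    loss.backward()
    opt.step()

# After a task, instances with the highest learned weights can be stored in a
# replay buffer and rehearsed alongside later tasks (this top-k policy is
# illustrative, not the paper's exact acquisition rule).
buffer_idx = torch.topk(ti.beta.detach(), k=10).indices
replay_x, replay_y = x[buffer_idx], y[buffer_idx]
```

In this sketch, the magnitude of each learned weight serves as a proxy for how much an instance contributes to (or resists) learning, which is the intuition behind using such parameters to quantify task difficulty.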


Related research

Is Multi-Task Learning an Upper Bound for Continual Learning? (10/26/2022)
Continual and multi-task learning are common machine learning approaches...

Batch-level Experience Replay with Review for Continual Learning (07/11/2020)
Continual learning is a branch of deep learning that seeks to strike a b...

Continual Learning in Task-Oriented Dialogue Systems (12/31/2020)
Continual learning in task-oriented dialogue systems can allow us to add...

Continual Learning in Low-rank Orthogonal Subspaces (10/22/2020)
In continual learning (CL), a learner is faced with a sequence of tasks,...

Learning Fast, Learning Slow: A General Continual Learning Method based on Complementary Learning System (01/29/2022)
Humans excel at continually learning from an ever-changing environment w...

New metrics for analyzing continual learners (09/01/2023)
Deep neural networks have shown remarkable performance when trained on i...
