Towards continuous learning for glioma segmentation with elastic weight consolidation

09/25/2019
by Karin van Garderen, et al.

When fine-tuning a convolutional neural network (CNN) on data from a new domain, catastrophic forgetting reduces performance on the original training data. Elastic Weight Consolidation (EWC) is a recent technique designed to prevent this, which we evaluated while training and re-training a CNN to segment glioma on two different datasets. The network was trained on the public BraTS dataset and fine-tuned on an in-house dataset of non-enhancing low-grade glioma. EWC was found to decrease catastrophic forgetting in this setting, but it was also found to restrict adaptation to the new domain.
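EWC works by adding a quadratic penalty to the fine-tuning loss that anchors each weight to its value after the first training stage, scaled by an estimate of that weight's importance (its diagonal Fisher information). A minimal sketch in plain Python of the penalty term; the weight values, Fisher estimates, and lambda below are illustrative toy numbers, not values from the paper:

```python
def ewc_penalty(theta, theta_star, fisher, lam):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2.

    theta      -- current weights during fine-tuning on the new domain
    theta_star -- weights after training on the original domain (e.g. BraTS)
    fisher     -- diagonal Fisher information estimate per weight
    lam        -- strength of the consolidation (higher = less forgetting,
                  but also less adaptation to the new domain)
    """
    return 0.5 * lam * sum(
        f * (t - ts) ** 2 for t, ts, f in zip(theta, theta_star, fisher)
    )

# Toy example with three weights: the first is "important" on the old task
# (high Fisher value), so moving it is penalized most.
theta_star = [1.0, -0.5, 2.0]
fisher = [4.0, 0.1, 1.0]
theta = [1.5, 0.5, 2.0]

penalty = ewc_penalty(theta, theta_star, fisher, lam=10.0)
print(penalty)  # 5.5
```

During fine-tuning this penalty is simply added to the segmentation loss on the new dataset; the trade-off the abstract reports (less forgetting, but restricted adaptation) corresponds to the choice of lambda.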

Related research

- 11/21/2022: Towards continually learning new languages. "Multilingual speech recognition with neural networks is often implemente..."
- 02/08/2018: Rotate your Networks: Better Weight Consolidation and Less Catastrophic Forgetting. "In this paper we propose an approach to avoiding catastrophic forgetting..."
- 11/11/2021: Kronecker Factorization for Preventing Catastrophic Forgetting in Large-scale Medical Entity Linking. "Multi-task learning is useful in NLP because it is often practically des..."
- 03/27/2021: Addressing catastrophic forgetting for medical domain expansion. "Model brittleness is a key concern when deploying deep learning models i..."
- 04/30/2022: Operational Adaptation of DNN Classifiers using Elastic Weight Consolidation. "Autonomous systems (AS) often use Deep Neural Network (DNN) classifiers ..."
- 12/11/2017: On Quadratic Penalties in Elastic Weight Consolidation. "Elastic weight consolidation (EWC, Kirkpatrick et al, 2017) is a novel a..."
- 07/06/2020: Dynamic memory to alleviate catastrophic forgetting in continuous learning settings. "In medical imaging, technical progress or changes in diagnostic procedur..."
