Kronecker Factorization for Preventing Catastrophic Forgetting in Large-scale Medical Entity Linking

11/11/2021
by Denis Jered McInerney, et al.

Multi-task learning is useful in NLP because it is often practically desirable to have a single model that works across a range of tasks. In the medical domain, sequential training on tasks may sometimes be the only way to train models, either because access to the original (potentially sensitive) data is no longer available, or simply owing to the computational costs inherent to joint retraining. A major issue inherent to sequential learning, however, is catastrophic forgetting, i.e., a substantial drop in accuracy on prior tasks when a model is updated for a new task. Elastic Weight Consolidation is a recently proposed method to address this issue, but scaling this approach to the modern large models used in practice requires making strong independence assumptions about model parameters, limiting its effectiveness. In this work, we apply Kronecker Factorization, a recent approach that relaxes these independence assumptions, to prevent catastrophic forgetting in convolutional and Transformer-based neural networks at scale. We show the effectiveness of this technique on the important and illustrative task of medical entity linking across three datasets, demonstrating the capability of the technique to make efficient updates to existing methods as new medical data becomes available. On average, the proposed method reduces catastrophic forgetting by 51%, a larger reduction than that achieved using standard Elastic Weight Consolidation, while maintaining spatial complexity proportional to the number of model parameters.
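The contrast the abstract draws, standard EWC with a diagonal (fully independent) Fisher versus a Kronecker-factored Fisher per layer, can be sketched in a few lines of NumPy. This is a minimal illustration under common K-FAC conventions, not the paper's implementation; all matrix names and sizes below are hypothetical.

```python
import numpy as np

def diagonal_ewc_penalty(delta, fisher_diag, lam=1.0):
    """Standard EWC penalty: the Fisher is assumed diagonal, so each
    parameter's drift from the old task is penalized independently."""
    return 0.5 * lam * np.sum(fisher_diag * delta ** 2)

def kfac_ewc_penalty(delta_W, A, S, lam=1.0):
    """Kronecker-factored penalty for one linear layer's weight drift
    delta_W (out x in). The layer's Fisher block is approximated as
    A kron S, where A covers input-activation second moments (in x in)
    and S the back-propagated gradient second moments (out x out).
    Using (A kron S) vec(X) = vec(S X A^T), the quadratic form reduces
    to a trace, so the full (out*in)^2 Fisher is never materialized."""
    return 0.5 * lam * np.trace(delta_W.T @ S @ delta_W @ A)

# Hypothetical toy sizes, just to exercise the two penalties.
rng = np.random.default_rng(0)
out_dim, in_dim = 4, 3
dW = rng.standard_normal((out_dim, in_dim))  # drift from old-task weights
a = rng.random(in_dim)                       # diagonal input factor
s = rng.random(out_dim)                      # diagonal gradient factor

# Sanity check: when both factors are diagonal, the Kronecker penalty
# collapses to the diagonal EWC penalty (with column-major flattening).
# With dense A and S it additionally captures the cross-parameter
# correlations that the diagonal assumption discards.
p_kfac = kfac_ewc_penalty(dW, np.diag(a), np.diag(s))
p_diag = diagonal_ewc_penalty(dW.flatten(order="F"), np.kron(a, s))
print(abs(p_kfac - p_diag) < 1e-9)  # True
```

The trace identity is the key design point: storing A and S keeps memory proportional to the number of layer parameters, which is the space-complexity claim the abstract makes.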


Related research

09/25/2019: Towards continuous learning for glioma segmentation with elastic weight consolidation
"When finetuning a convolutional neural network (CNN) on data from a new ..."

02/08/2018: Rotate your Networks: Better Weight Consolidation and Less Catastrophic Forgetting
"In this paper we propose an approach to avoiding catastrophic forgetting..."

11/21/2022: Towards continually learning new languages
"Multilingual speech recognition with neural networks is often implemente..."

05/18/2018: Overcoming catastrophic forgetting problem by weight consolidation and long-term memory
"Sequential learning of multiple tasks in artificial neural networks usin..."

10/10/2019: Learning to Remember from a Multi-Task Teacher
"Recent studies on catastrophic forgetting during sequential learning typ..."

04/12/2019: Incremental multi-domain learning with network latent tensor factorization
"The prominence of deep learning, large amount of annotated data and incr..."

03/27/2021: Addressing catastrophic forgetting for medical domain expansion
"Model brittleness is a key concern when deploying deep learning models i..."
