Lightweight Transformers for Clinical Natural Language Processing

02/09/2023
by Omid Rohanian, et al.

Specialised pre-trained language models are becoming increasingly common in NLP because they can potentially outperform models trained on generic texts. BioBERT and BioClinicalBERT are two examples of such models that have shown promise in medical NLP tasks. Many of these models are over-parametrised and resource-intensive, but thanks to techniques like Knowledge Distillation (KD), it is possible to create smaller versions that perform almost as well as their larger counterparts. In this work, we focus specifically on the development of compact language models for processing clinical texts (e.g. progress notes and discharge summaries). Using knowledge distillation and continual learning, we developed a number of efficient lightweight clinical transformers, with parameter counts ranging from 15 million to 65 million. These models performed comparably to larger models such as BioBERT and BioClinicalBERT and significantly outperformed other compact models trained on general or biomedical data. Our extensive evaluation spanned several standard datasets and a wide range of clinical text-mining tasks, including Natural Language Inference, Relation Extraction, Named Entity Recognition, and Sequence Classification. To our knowledge, this is the first comprehensive study specifically focused on creating efficient and compact transformers for clinical NLP tasks. To promote reproducibility, the models and code used in this study are available on our Hugging Face profile at https://huggingface.co/nlpie and our GitHub page at https://github.com/nlpie-research/Lightweight-Clinical-Transformers.
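To make the distillation setup concrete, the sketch below shows one common way a compact student can be trained against a clinical teacher with a soft-target KD objective. This is a minimal illustration, not the paper's training code: the teacher checkpoint, the 4-layer student depth, the temperature, the loss weighting, and the data pipeline are all assumptions made here for the example.

```python
# Minimal knowledge-distillation sketch for a compact clinical student.
# Assumptions (not from the paper): teacher checkpoint, 4-layer student,
# temperature, loss weights, and batches drawn from masked clinical text.
import torch
import torch.nn.functional as F
from transformers import AutoConfig, AutoModelForMaskedLM

TEMPERATURE = 2.0  # assumed softening temperature for the soft targets
ALPHA = 0.5        # assumed mix between soft (teacher) and hard (MLM) losses

# Teacher: an existing clinical BERT. Student: a shallower model built on the
# same vocabulary so teacher and student logits are directly comparable.
teacher = AutoModelForMaskedLM.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
student_config = AutoConfig.from_pretrained(
    "emilyalsentzer/Bio_ClinicalBERT", num_hidden_layers=4
)
student = AutoModelForMaskedLM.from_config(student_config)

teacher.eval()  # the teacher stays frozen throughout distillation

def distillation_loss(batch):
    """One step of soft-target KD on a masked-LM batch.

    `batch` is assumed to contain input_ids, attention_mask, and labels
    (with -100 on unmasked positions), e.g. as produced by
    DataCollatorForLanguageModeling over clinical notes.
    """
    with torch.no_grad():
        teacher_logits = teacher(
            input_ids=batch["input_ids"], attention_mask=batch["attention_mask"]
        ).logits
    student_logits = student(
        input_ids=batch["input_ids"], attention_mask=batch["attention_mask"]
    ).logits

    # Soft-target loss: KL divergence between temperature-scaled distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / TEMPERATURE, dim=-1),
        F.softmax(teacher_logits / TEMPERATURE, dim=-1),
        reduction="batchmean",
    ) * TEMPERATURE**2

    # Hard-target loss: ordinary MLM cross-entropy on the masked tokens.
    hard = F.cross_entropy(
        student_logits.view(-1, student_logits.size(-1)),
        batch["labels"].view(-1),
        ignore_index=-100,
    )
    return ALPHA * soft + (1 - ALPHA) * hard
```

The released checkpoints themselves should be loadable with the standard AutoModel.from_pretrained call against model identifiers listed under the https://huggingface.co/nlpie profile.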

Related research

- On the Effectiveness of Compact Biomedical Transformers (09/07/2022): Language models pre-trained on biomedical corpora, such as BioBERT, have...
- A Comparative Study of Pretrained Language Models for Long Clinical Text (01/27/2023): Objective: Clinical knowledge enriched transformer models (e.g., Clinica...
- MiniALBERT: Model Distillation via Parameter-Efficient Recursive Transformers (10/12/2022): Pre-trained Language Models (LMs) have become an integral part of Natura...
- Bioformer: an efficient transformer language model for biomedical text mining (02/03/2023): Pretrained language models such as Bidirectional Encoder Representations...
- Training Compact Models for Low Resource Entity Tagging using Pre-trained Language Models (10/14/2019): Training models on low-resource named entity recognition tasks has been ...
- A Unified Framework of Medical Information Annotation and Extraction for Chinese Clinical Text (03/08/2022): Medical information extraction consists of a group of natural language p...
- LASIGE and UNICAGE solution to the NASA LitCoin NLP Competition (08/10/2023): Biomedical Natural Language Processing (NLP) tends to become cumbersome ...
