BERT for Long Documents: A Case Study of Automated ICD Coding

11/04/2022
by Arash Afkanpour, et al.

Transformer models have achieved great success across many NLP problems. However, previous studies in automated ICD coding concluded that these models fail to outperform some earlier solutions, such as CNN-based models. In this paper we challenge this conclusion. We present a simple and scalable method for processing long text with existing transformer models such as BERT. We show that this method significantly improves on previously reported results for transformer models in ICD coding, and outperforms one of the prominent CNN-based methods.
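The abstract does not spell out the method, but a common and scalable way to apply a fixed-length model like BERT to long clinical notes is to split each document into overlapping chunks, encode every chunk independently, and pool the chunk representations before a multi-label classification head. The sketch below illustrates that general pattern with Hugging Face Transformers; the backbone checkpoint, chunk length, stride, and mean-pooling choice are illustrative assumptions, not the paper's exact configuration.

import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumed backbone; the paper's checkpoint may differ
CHUNK_LEN = 512                   # BERT's maximum input length in tokens
STRIDE = 128                      # overlap between consecutive chunks (assumed value)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)

def encode_long_document(text: str) -> torch.Tensor:
    """Return a single document embedding by chunking and mean-pooling."""
    # Tokenize with overflow so that texts longer than CHUNK_LEN are split
    # into overlapping chunks, each padded to exactly CHUNK_LEN tokens.
    enc = tokenizer(
        text,
        max_length=CHUNK_LEN,
        stride=STRIDE,
        truncation=True,
        padding="max_length",
        return_overflowing_tokens=True,
        return_tensors="pt",
    )
    with torch.no_grad():
        out = model(
            input_ids=enc["input_ids"],
            attention_mask=enc["attention_mask"],
        )
    # One [CLS] vector per chunk -> mean-pool into a document vector.
    cls_per_chunk = out.last_hidden_state[:, 0, :]  # (num_chunks, hidden_size)
    return cls_per_chunk.mean(dim=0)                # (hidden_size,)

For ICD coding, a multi-label head would then map the pooled vector to per-code logits, e.g. logits = torch.nn.Linear(model.config.hidden_size, num_icd_codes)(doc_vec), trained with a binary cross-entropy loss over codes. Because chunks are encoded independently, this approach scales linearly with document length and parallelizes across chunks.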


