A Small and Fast BERT for Chinese Medical Punctuation Restoration

08/24/2023
by   Tongtao Ling, et al.

In clinical dictation, transcripts produced by automatic speech recognition (ASR) lack explicit punctuation marks, which can make dictated reports ambiguous or misleading. Producing precise and readable clinical reports from ASR output therefore requires automatic punctuation restoration. Targeting this practical scenario, we propose a fast and lightweight pre-trained model for Chinese medical punctuation restoration based on the 'pre-training and fine-tuning' paradigm. Specifically, we distill pre-trained models while incorporating supervised contrastive learning and a novel auxiliary pre-training task, Punctuation Mark Prediction, to make the distilled model well suited for punctuation restoration. Our experiments on various distilled models reveal that our model can achieve 95 while 10
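To make the task framing concrete, punctuation restoration is commonly cast as token classification: the punctuated training text is stripped of punctuation, and each mark becomes the label of the token that precedes it. The sketch below illustrates this data preparation step in plain Python; the label names and punctuation inventory are illustrative assumptions, not the paper's exact setup.

```python
# Hedged sketch: turning punctuated Chinese text into (token, label) pairs
# for a punctuation-restoration / Punctuation Mark Prediction objective.
# The punctuation inventory and label names below are assumptions.

PUNCT_LABELS = {
    "，": "COMMA",
    "。": "PERIOD",
    "？": "QUESTION",
    "、": "ENUM_COMMA",
}

def to_restoration_example(text):
    """Strip punctuation from `text` and attach each mark as the label
    of the preceding character; unpunctuated characters get label 'O'."""
    tokens, labels = [], []
    for ch in text:
        if ch in PUNCT_LABELS:
            if labels:  # a mark labels the token that comes before it
                labels[-1] = PUNCT_LABELS[ch]
        else:
            tokens.append(ch)
            labels.append("O")
    return tokens, labels

tokens, labels = to_restoration_example("患者头痛三天，伴有发热。")
print(tokens)  # character sequence with punctuation removed
print(labels)  # per-character labels: 'O' except 'COMMA' and 'PERIOD'
```

A model fine-tuned on such pairs predicts one label per token at inference time, and the predicted marks are re-inserted after their tokens to recover a punctuated report.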


