
Chinese Grammatical Correction Using BERT-based Pre-trained Model

11/04/2020
by Hongfei Wang, et al.

In recent years, pre-trained models have been studied extensively, and several downstream tasks have benefited from them. In this study, we verify the effectiveness of two methods that incorporate the BERT-based pre-trained model of Cui et al. (2020) into an encoder-decoder model for Chinese grammatical error correction. We also analyze the errors by type and conclude that sentence-level errors have yet to be addressed.
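As a concrete illustration of this kind of setup, the sketch below warm-starts a Transformer encoder-decoder with a Chinese BERT checkpoint and computes a training loss on a toy correction pair. This is a minimal sketch, not the paper's implementation: it assumes the HuggingFace transformers library, and the checkpoint name hfl/chinese-bert-wwm-ext (one of the models released by Cui et al.), the hyperparameter choices, and the example sentences are illustrative.

```python
# Minimal sketch: warm-starting an encoder-decoder model for Chinese
# grammatical error correction with a BERT-based Chinese pre-trained model.
# Assumes the HuggingFace transformers library; the checkpoint name and
# toy data are illustrative, not the paper's exact setup.
from transformers import BertTokenizerFast, EncoderDecoderModel

name = "hfl/chinese-bert-wwm-ext"
tokenizer = BertTokenizerFast.from_pretrained(name)

# Initialize both encoder and decoder from the BERT checkpoint; the decoder
# automatically gets cross-attention layers and a causal attention mask.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(name, name)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.vocab_size = model.config.decoder.vocab_size

# Toy correction pair: erroneous source sentence -> corrected target.
src = tokenizer("我今天去了图书馆看书书。", return_tensors="pt")
tgt = tokenizer("我今天去了图书馆看书。", return_tensors="pt")

# One forward pass; labels are shifted internally to feed the decoder.
outputs = model(input_ids=src.input_ids,
                attention_mask=src.attention_mask,
                labels=tgt.input_ids)
print(float(outputs.loss))  # cross-entropy loss for this training pair
```

In practice such a model would be fine-tuned on parallel erroneous/corrected sentence pairs before being used for inference with model.generate.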


Related research

10/23/2020
Pre-trained Model for Chinese Word Segmentation with Meta Learning
Recent research shows that pre-trained models such as BERT (Devlin et a...

05/03/2020
Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction
This paper investigates how to effectively incorporate a pre-trained mas...

11/09/2022
A Method to Judge the Style of Classical Poetry Based on Pre-trained Model
One of the important topics in the research field of Chinese classical p...

11/16/2021
Integrated Semantic and Phonetic Post-correction for Chinese Speech Recognition
Due to recent advances in natural language processing, several works...

04/15/2022
Improving Pre-trained Language Models with Syntactic Dependency Prediction Task for Chinese Semantic Error Recognition
Existing Chinese text error detection mainly focuses on spelling and sim...

03/01/2019
Improving Grammatical Error Correction via Pre-Training a Copy-Augmented Architecture with Unlabeled Data
Neural machine translation systems have become state-of-the-art approach...

12/31/2020
Unified Mandarin TTS Front-end Based on Distilled BERT Model
The front-end module in a typical Mandarin text-to-speech system (TTS) i...