Towards Minimal Supervision BERT-based Grammar Error Correction

01/10/2020
by Yiyuan Li, et al.

Current grammatical error correction (GEC) models typically treat the task as sequence generation, which requires large amounts of annotated data and limits their applicability in data-limited settings. We incorporate contextual information from a pre-trained language model to make better use of limited annotation and to benefit multilingual scenarios. Results show the strong potential of Bidirectional Encoder Representations from Transformers (BERT) for the grammatical error correction task.
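As a minimal sketch of how a pre-trained masked language model can propose corrections with little task-specific supervision, the snippet below masks a suspect token and lets BERT rank in-context replacements. It assumes the HuggingFace transformers library and the bert-base-cased checkpoint; the function name suggest_corrections and the single-token masking strategy are illustrative assumptions, not the authors' exact method.

# Illustrative sketch (not the paper's exact method): use BERT's masked
# language model to propose replacements for a suspect token in context.
# Assumes the HuggingFace `transformers` library and `bert-base-cased`.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = BertForMaskedLM.from_pretrained("bert-base-cased")
model.eval()

def suggest_corrections(sentence_tokens, suspect_index, top_k=5):
    """Mask one token and let BERT rank in-context replacements."""
    masked = list(sentence_tokens)
    masked[suspect_index] = tokenizer.mask_token
    inputs = tokenizer(" ".join(masked), return_tensors="pt")
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability distribution over the vocabulary at the masked position.
    probs = logits[0, mask_pos[0]].softmax(dim=-1)
    top = torch.topk(probs, top_k)
    return [(tokenizer.convert_ids_to_tokens(int(i)), float(p))
            for i, p in zip(top.indices, top.values)]

# Example: BERT should prefer "goes" or "went" over the erroneous "go".
print(suggest_corrections("She go to school every day .".split(), 1))

In a minimal-supervision setting, candidates ranked this way could be filtered or re-scored with whatever small amount of annotated data is available.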


Related research

05/03/2020
Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction
This paper investigates how to effectively incorporate a pre-trained mas...

09/29/2021
Hierarchical Character Tagger for Short Text Spelling Error Correction
State-of-the-art approaches to spelling error correction problem include...

05/15/2020
Spelling Error Correction with Soft-Masked BERT
Spelling error correction is an important yet challenging task because a...

06/04/2020
Personalizing Grammatical Error Correction: Adaptation to Proficiency Level and L1
Grammar error correction (GEC) systems have become ubiquitous in a varie...

09/14/2021
LM-Critic: Language Models for Unsupervised Grammatical Error Correction
Training a model for grammatical error correction (GEC) requires a set o...

05/22/2023
Bidirectional Transformer Reranker for Grammatical Error Correction
Pre-trained seq2seq models have achieved state-of-the-art results in the...

06/06/2021
Do Grammatical Error Correction Models Realize Grammatical Generalization?
There has been an increased interest in data generation approaches to gr...
