Stronger Baselines for Grammatical Error Correction Using Pretrained Encoder-Decoder Model

05/24/2020 · by Satoru Katsumata, et al.

The grammatical error correction (GEC) literature has reported on the effectiveness of pretraining a Seq2Seq model with large amounts of pseudo data. In this study, we explored two generic pretrained encoder-decoder (Enc-Dec) models, including BART, which has achieved state-of-the-art (SOTA) results on several Seq2Seq tasks other than GEC. We found that monolingual and multilingual BART models achieve high performance in GEC, including a result competitive with the current SOTA in English GEC. Our implementations will be publicly available on GitHub.
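For context on how "high performance in GEC" is measured: GEC systems are conventionally scored at the edit level with the F0.5 measure (as in the M2 and ERRANT scorers), which weights precision twice as heavily as recall, since unnecessary corrections are considered more harmful than missed ones. A minimal sketch of that computation from edit counts (the counts below are illustrative, not figures from the paper):

```python
def f_beta(tp: int, fp: int, fn: int, beta: float = 0.5) -> float:
    """Edit-level F_beta; GEC evaluation conventionally uses beta = 0.5.

    tp: system edits that match the gold annotation
    fp: system edits absent from the gold annotation
    fn: gold edits the system missed
    """
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Illustrative: 60 correct edits, 20 spurious, 40 missed
# -> precision 0.75, recall 0.60, F0.5 ~ 0.714
print(round(f_beta(60, 20, 40), 4))
```

With beta = 1 the same formula reduces to the ordinary harmonic-mean F1; lowering beta to 0.5 shifts the score toward precision, which is why conservative GEC systems often score well on this metric.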


Related research:

05/28/2018 · Graph-based Filtering of Out-of-Vocabulary Words for Encoder-Decoder Models
Encoder-decoder models typically only employ words that are frequently u...

06/25/2021 · DeltaLM: Encoder-Decoder Pre-training for Language Generation and Translation by Augmenting Pretrained Multilingual Encoders
While pretrained encoders have achieved success in various natural langu...

03/29/2023 · ProductAE: Toward Deep Learning Driven Error-Correction Codes of Large Dimensions
While decades of theoretical research have led to the invention of sever...

11/19/2017 · An Improved Oscillating-Error Classifier with Branching
This paper extends the earlier work on an oscillating error correction t...

05/03/2020 · Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction
This paper investigates how to effectively incorporate a pre-trained mas...

05/21/2017 · Spelling Correction as a Foreign Language
In this paper, we reformulated the spell correction problem as a machine...

01/17/2022 · Proficiency Matters Quality Estimation in Grammatical Error Correction
This study investigates how supervised quality estimation (QE) models of...
