Weakly Supervised Grammatical Error Correction using Iterative Decoding

10/31/2018
by Jared Lichtarge et al.

We describe an approach to Grammatical Error Correction (GEC) that is effective at making use of models trained on large amounts of weakly supervised bitext. We train the Transformer sequence-to-sequence model on 4B tokens of Wikipedia revisions and employ an iterative decoding strategy that is tailored to the loosely-supervised nature of the Wikipedia training corpus. Finetuning on the Lang-8 corpus and ensembling yields an F0.5 of 58.3 on the CoNLL'14 benchmark and a GLEU of 62.4 on JFLEG. The combination of weakly supervised training and iterative decoding obtains an F0.5 of 48.2 on CoNLL'14 even without using any labeled GEC data.
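
The abstract describes iterative decoding only at a high level, so the sketch below illustrates the general idea: the sequence-to-sequence model is applied to its own output repeatedly, and a rewrite is kept only when the model clearly prefers it to leaving the sentence unchanged. This is a minimal illustration, not the authors' implementation; the model.beam_search and model.score interfaces, the loop bound, and the acceptance threshold are all assumptions made for the example.

# Minimal sketch of iterative decoding for GEC (illustrative; not the authors' code).
# Assumes a hypothetical seq2seq `model` exposing:
#   model.beam_search(src, beam_size) -> list of (hypothesis, cost) pairs,
#                                        where cost is a negative log-probability
#   model.score(src, tgt)             -> cost of producing tgt given src
def iterative_decode(model, sentence, max_iters=5, beam_size=4, threshold=0.9):
    """Repeatedly rewrite `sentence`, keeping a rewrite only when the model
    prefers it to copying the current input unchanged."""
    current = sentence
    for _ in range(max_iters):
        hypotheses = model.beam_search(current, beam_size=beam_size)
        best, best_cost = min(hypotheses, key=lambda h: h[1])
        identity_cost = model.score(current, current)  # cost of copying the input
        # Accept only rewrites that are clearly cheaper than the identity copy;
        # the multiplicative threshold is an assumed safeguard against over-editing
        # when no high-confidence correction exists.
        if best != current and best_cost < threshold * identity_cost:
            current = best
        else:
            break  # no confident correction remains; stop iterating
    return current

For reference, the reported metric F0.5 is the F-measure that weights precision more heavily than recall: F0.5 = (1 + 0.5^2) * P * R / (0.5^2 * P + R).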
