Bidirectional Transformer Reranker for Grammatical Error Correction

05/22/2023
by Ying Zhang, et al.

Pre-trained seq2seq models have achieved state-of-the-art results on the grammatical error correction (GEC) task. However, these models still suffer from a prediction bias due to their unidirectional decoding. We therefore propose a bidirectional Transformer reranker (BTR) that re-estimates the probability of each candidate sentence generated by a pre-trained seq2seq model. The BTR preserves the seq2seq-style Transformer architecture but uses a BERT-style self-attention mechanism in the decoder: the probability of each target token is computed via masked language modeling, which captures bidirectional representations of the target context. To guide the reranking, the BTR adds negative sampling to the objective function so as to minimize the unlikelihood of incorrect candidates. During inference, the BTR produces the final output after comparing the reranked top-1 result with the original one against an acceptance threshold. Experimental results show that, when reranking candidates from a pre-trained seq2seq model, T5-base, the BTR on top of T5-base yields F0.5 scores of 65.47 and 71.27 on the CoNLL-14 and BEA test sets, respectively, and a GLEU score of 59.52 on the JFLEG corpus, improvements of 0.36, 0.76, and 0.48 points over the original T5-base. Furthermore, when reranking candidates from T5-large, the BTR on top of T5-base improves on the original T5-large by 0.26 points on the BEA test set.
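To make the two mechanisms above concrete, the sketch below scores each candidate by masking one target position at a time, summing the masked-LM log-probabilities computed from the full bidirectional target context, and then falls back to the seq2seq top-1 unless the reranked winner clears an acceptance margin. This is a minimal illustrative sketch, not the authors' implementation: the scorer callable, the names `score_candidate` and `rerank`, the `mask_id` token, and the exact `threshold` semantics are all assumptions made here for illustration.

```python
import torch

def score_candidate(model, src_ids, tgt_ids, mask_id):
    """Sum the log-probability of every target token, estimated by
    masking that token and letting the decoder attend to the target
    context on both sides (BERT-style masked LM scoring)."""
    total = 0.0
    for i in range(len(tgt_ids)):
        masked = list(tgt_ids)
        masked[i] = mask_id                      # hide position i
        logits = model(src_ids, masked)          # (tgt_len, vocab_size)
        log_probs = torch.log_softmax(logits[i], dim=-1)
        total += log_probs[tgt_ids[i]].item()
    return total

def rerank(model, src_ids, candidates, mask_id, threshold=0.5):
    """Re-score seq2seq candidates with the bidirectional scorer; keep
    the seq2seq top-1 (candidates[0]) unless another candidate beats it
    by at least `threshold` (a stand-in for the acceptance criterion)."""
    scores = [score_candidate(model, src_ids, c, mask_id) for c in candidates]
    best = max(range(len(candidates)), key=scores.__getitem__)
    if best != 0 and scores[best] - scores[0] < threshold:
        best = 0  # not confident enough: fall back to the original output
    return candidates[best]

if __name__ == "__main__":
    # Toy stand-in for a trained reranker: random logits over a small vocab.
    vocab_size, mask_id = 100, 99
    toy_model = lambda src, tgt: torch.randn(len(tgt), vocab_size)
    candidates = [[5, 7, 9], [5, 8, 9]]          # candidate token-id sequences
    print(rerank(toy_model, [1, 2, 3], candidates, mask_id))
```

Note the O(target length) forward passes per candidate: scoring a token bidirectionally requires masking it, so each position needs its own masked pass, which is the price paid for conditioning on both left and right context.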


Related research

09/29/2021 · Hierarchical Character Tagger for Short Text Spelling Error Correction
State-of-the-art approaches to spelling error correction problem include...

05/25/2022 · Transcormer: Transformer for Sentence Scoring with Sliding Language Modeling
Sentence scoring aims at measuring the likelihood score of a sentence an...

01/10/2020 · Towards Minimal Supervision BERT-based Grammar Error Correction
Current grammatical error correction (GEC) models typically consider the...

03/01/2019 · Improving Grammatical Error Correction via Pre-Training a Copy-Augmented Architecture with Unlabeled Data
Neural machine translation systems have become state-of-the-art approach...

05/26/2020 · GECToR – Grammatical Error Correction: Tag, Not Rewrite
In this paper, we present a simple and efficient GEC sequence tagger usi...

04/07/2021 · Interpreting A Pre-trained Model Is A Key For Model Architecture Optimization: A Case Study On Wav2Vec 2.0
A deep Transformer model with good evaluation score does not mean each s...

03/07/2021 · Orthogonal Attention: A Cloze-Style Approach to Negation Scope Resolution
Negation Scope Resolution is an extensively researched problem, which is...
