Context is Key: Grammatical Error Detection with Contextual Word Representations

06/15/2019
by Samuel Bell, et al.

Grammatical error detection (GED) in non-native writing requires systems to identify a wide range of errors in text written by language learners. Error detection as a purely supervised task can be challenging, as GED datasets are limited in size and the label distributions are highly imbalanced. Contextualized word representations offer a possible solution, as they can efficiently capture compositional information in language and can be optimized on large amounts of unsupervised data. In this paper, we perform a systematic comparison of ELMo, BERT and Flair embeddings (Peters et al., 2017; Devlin et al., 2018; Akbik et al., 2018) on a range of public GED datasets, and propose an approach to effectively integrate such representations in current methods, achieving a new state of the art on GED. We further analyze the strengths and weaknesses of different contextual embeddings for the task at hand, and present detailed analyses of their impact on different types of errors.
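The integration the abstract describes treats GED as binary sequence labeling over contextual word representations. The paper's exact architecture is not given here, so the following is only a minimal sketch of that setup: a bidirectional LSTM tagger over pre-computed contextual embeddings (the 768-dimensional size, the hidden width, and the `GEDTagger` name are illustrative assumptions, not the authors' configuration).

```python
import torch
import torch.nn as nn

class GEDTagger(nn.Module):
    """Binary sequence labeler: tags each token as correct (0) or erroneous (1).

    Contextual embeddings (e.g. from ELMo, BERT, or Flair) are assumed to be
    pre-computed elsewhere; they enter here as a (batch, seq_len, dim) tensor.
    All sizes below are placeholder choices, not the paper's settings.
    """
    def __init__(self, emb_dim=768, hidden=256):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, 2)  # correct vs. error

    def forward(self, contextual_embeddings):
        out, _ = self.lstm(contextual_embeddings)
        return self.classifier(out)  # per-token logits, shape (batch, seq, 2)

# Stand-in for embeddings produced by a pretrained contextual model:
emb = torch.randn(1, 10, 768)   # one sentence of 10 tokens
tagger = GEDTagger()
logits = tagger(emb)            # shape: (1, 10, 2)
```

Because GED label distributions are highly imbalanced (most tokens are correct), such a tagger is typically trained with a class-weighted cross-entropy loss, e.g. `nn.CrossEntropyLoss(weight=...)`.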


