Multi-task Learning for Chinese Word Usage Errors Detection

04/03/2019
by Jinbin Zhang, et al.

Chinese word usage errors often occur in non-native Chinese learners' writing, and detecting them automatically is very helpful to learners as they practice writing. In this paper, we propose a novel approach that takes advantage of auxiliary tasks, such as POS-tag prediction and word log-frequency prediction, to help the main task of Chinese word usage error detection. With the help of these auxiliary tasks, we achieve state-of-the-art results on the HSK corpus without using any extra data.
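The abstract describes a multi-task setup in which auxiliary prediction tasks share parameters with the main error-detection task. The sketch below illustrates one common way to realize this: a shared encoder with a per-token error-detection head plus POS-tagging and log-frequency heads, trained with a weighted joint loss. The BiLSTM encoder, layer sizes, loss weights, and function names are assumptions for illustration, not details taken from the paper.

import torch
import torch.nn as nn

class MultiTaskErrorDetector(nn.Module):
    """Shared BiLSTM encoder with three task-specific heads:
    word usage error detection (main task), POS tagging and
    word log-frequency prediction (auxiliary tasks)."""

    def __init__(self, vocab_size, num_pos_tags, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Main task: per-token binary error / no-error classification.
        self.error_head = nn.Linear(2 * hidden_dim, 2)
        # Auxiliary task 1: per-token POS-tag classification.
        self.pos_head = nn.Linear(2 * hidden_dim, num_pos_tags)
        # Auxiliary task 2: per-token log-frequency regression.
        self.freq_head = nn.Linear(2 * hidden_dim, 1)

    def forward(self, token_ids):
        hidden, _ = self.encoder(self.embedding(token_ids))
        return (self.error_head(hidden),
                self.pos_head(hidden),
                self.freq_head(hidden).squeeze(-1))


def training_step(model, batch, optimizer, aux_weight=0.3):
    """One joint update: auxiliary losses are added to the main loss
    with a fixed weight (the weight here is illustrative only)."""
    token_ids, error_labels, pos_labels, log_freqs = batch
    error_logits, pos_logits, freq_preds = model(token_ids)
    # Cross-entropy expects (batch, classes, seq_len) for per-token labels.
    loss = nn.functional.cross_entropy(error_logits.transpose(1, 2), error_labels)
    loss += aux_weight * nn.functional.cross_entropy(pos_logits.transpose(1, 2), pos_labels)
    loss += aux_weight * nn.functional.mse_loss(freq_preds, log_freqs)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

The intuition behind the auxiliary heads is that both POS information and word frequency are strong cues for word usage errors, so forcing the shared encoder to predict them regularizes its representations without requiring additional annotated error data.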
