Chinese grammatical error correction based on knowledge distillation

07/31/2022
by Peng Xia, et al.

Existing Chinese grammatical error correction models show poor robustness on attack test sets and have large numbers of parameters. This paper applies knowledge distillation to compress the model and improve its resistance to attacks. On the data side, an attack test set is constructed by injecting perturbations into the standard evaluation dataset, and model robustness is evaluated on this set. Experimental results show that the distilled small model preserves performance and trains faster while using fewer parameters, achieves the best results on the attack test set, and is significantly more robust.
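The distillation setup described above, training a small student model to match both the gold labels and a large teacher's output distribution, can be sketched as a weighted loss. This is a minimal illustrative sketch, not the paper's actual implementation; the temperature `T`, mixing weight `alpha`, and function names are assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields a softer distribution."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, gold_label,
                      T=2.0, alpha=0.5):
    """Weighted sum of a soft (teacher-matching) and a hard (label) loss."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 as is conventional in distillation
    soft = float(np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))) * T * T
    # standard cross-entropy of the student against the gold label
    hard = -float(np.log(softmax(student_logits)[gold_label]))
    return alpha * soft + (1 - alpha) * hard
```

When the student already matches the teacher, the soft term vanishes and only the label loss remains, which is why the mixing weight `alpha` controls how strongly the student imitates the teacher versus fitting the data.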


