
Hierarchical Transformer Encoders for Vietnamese Spelling Correction

05/28/2021
by Hieu Tran, et al.

In this paper, we propose a Hierarchical Transformer model for the Vietnamese spelling correction problem. The model consists of multiple Transformer encoders and uses both character-level and word-level information to detect errors and make corrections. In addition, to facilitate future work on Vietnamese spelling correction, we introduce a realistic dataset collected from real-life texts. We compare our method with existing approaches and publicly available systems; it outperforms all of them in precision, recall, and F1-score. A demo version is publicly available.
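
For context, the sketch below shows one way such a hierarchical encoder could be wired up: a lower character-level Transformer pools each word's characters into a single vector, which is fused with a word embedding and passed through an upper word-level Transformer with separate error-detection and correction heads. Everything here (the name HierarchicalSpellingEncoder, the mean-pooling fusion, dimensions, and layer counts) is an illustrative assumption based on the abstract, not the authors' released code; positional encodings are omitted for brevity.

```python
# Minimal sketch of a hierarchical (character- + word-level) Transformer
# encoder for spelling correction. Names, dimensions, and the fusion
# strategy are assumptions, not the paper's actual implementation.
import torch
import torch.nn as nn


class HierarchicalSpellingEncoder(nn.Module):
    def __init__(self, char_vocab, word_vocab, d_model=256, nhead=4,
                 char_layers=2, word_layers=4):
        super().__init__()
        self.char_emb = nn.Embedding(char_vocab, d_model)
        self.word_emb = nn.Embedding(word_vocab, d_model)
        char_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        word_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # Lower encoder: contextualizes characters within each word.
        self.char_encoder = nn.TransformerEncoder(char_layer, char_layers)
        # Upper encoder: contextualizes words across the sentence.
        self.word_encoder = nn.TransformerEncoder(word_layer, word_layers)
        self.detect_head = nn.Linear(d_model, 2)            # error / no error
        self.correct_head = nn.Linear(d_model, word_vocab)  # corrected word

    def forward(self, char_ids, word_ids):
        # char_ids: (batch, seq_len, max_chars); word_ids: (batch, seq_len)
        b, s, c = char_ids.shape
        chars = self.char_emb(char_ids).view(b * s, c, -1)
        chars = self.char_encoder(chars)
        # Mean-pool character states into one vector per word (one of
        # several plausible pooling choices).
        char_word = chars.mean(dim=1).view(b, s, -1)
        fused = char_word + self.word_emb(word_ids)
        hidden = self.word_encoder(fused)
        return self.detect_head(hidden), self.correct_head(hidden)


# Usage (shapes only):
#   model = HierarchicalSpellingEncoder(char_vocab=200, word_vocab=30000)
#   detect_logits, correct_logits = model(char_ids, word_ids)
```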

Related research

Hierarchical Character Tagger for Short Text Spelling Error Correction (09/29/2021)
State-of-the-art approaches to spelling error correction problem include...

Hierarchical Attention Transformer Architecture For Syntactic Spell Correction (05/11/2020)
The attention mechanisms are playing a boosting role in advancements in ...

Ensembling and Knowledge Distilling of Large Sequence Taggers for Grammatical Error Correction (03/24/2022)
In this paper, we investigate improvements to the GEC sequence tagging a...

Tighter Bounds on the Expressivity of Transformer Encoders (01/25/2023)
Characterizing neural networks in terms of better-understood formal syst...

BSpell: A CNN-blended BERT Based Bengali Spell Checker (08/20/2022)
Bengali typing is mostly performed using English keyboard and can be hig...

RTHN: A RNN-Transformer Hierarchical Network for Emotion Cause Extraction (06/04/2019)
The emotion cause extraction (ECE) task aims at discovering the potentia...

An EMD-based Method for the Detection of Power Transformer Faults with a Hierarchical Ensemble Classifier (10/21/2021)
In this paper, an Empirical Mode Decomposition-based method is proposed ...