
Subword-Delimited Downsampling for Better Character-Level Translation

12/02/2022
by Lukas Edman, et al.
University of Groningen

Subword-level models have been the dominant paradigm in NLP. However, character-level models see each character individually, giving the model more fine-grained information that could ultimately lead to better performance. Recent work has shown character-level models to be competitive with subword models, but costly in terms of time and computation. Character-level models with a downsampling component alleviate this cost, but at the expense of quality, particularly for machine translation. This work analyzes the problems of previous downsampling methods and introduces a novel downsampling method that is informed by subwords. This new method not only outperforms existing downsampling methods, showing that characters can be downsampled without sacrificing quality, but also achieves promising performance compared to subword models for translation.
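The core idea suggested by the abstract is to replace fixed-stride character downsampling with pooling over subword boundaries, so that the downsampled sequence has one position per subword rather than one per arbitrary fixed-width window. Below is a minimal PyTorch sketch of that idea; the class name `SubwordDelimitedDownsampler`, the mean-pooling choice, and the toy segmentation are illustrative assumptions, not the paper's released implementation (see the SDD repository below for that).

```python
# Minimal sketch of subword-delimited downsampling, based on the abstract
# (not the authors' code). Mean pooling and all names here are assumptions.
import torch
import torch.nn as nn

class SubwordDelimitedDownsampler(nn.Module):  # hypothetical name
    def __init__(self, vocab_size: int, dim: int):
        super().__init__()
        self.char_embed = nn.Embedding(vocab_size, dim)

    def forward(self, char_ids: torch.Tensor, subword_lengths: list[int]) -> torch.Tensor:
        # char_ids: (seq_len,) character IDs for one sentence.
        # subword_lengths: characters per subword, taken from a subword
        # tokenizer (e.g. BPE); they must sum to seq_len.
        chars = self.char_embed(char_ids)             # (seq_len, dim)
        groups = torch.split(chars, subword_lengths)  # one chunk per subword
        # Pool each subword's character vectors into a single position.
        return torch.stack([g.mean(dim=0) for g in groups])  # (n_subwords, dim)

# Toy example: "hello world" with a hypothetical segmentation into 4 subwords.
model = SubwordDelimitedDownsampler(vocab_size=128, dim=16)
char_ids = torch.tensor([ord(c) for c in "hello world"])
pooled = model(char_ids, subword_lengths=[3, 2, 4, 2])  # "hel", "lo", " wor", "ld"
print(pooled.shape)  # torch.Size([4, 16]): 11 characters -> 4 subword positions
```

In this reading, each downsampled position corresponds to a linguistically meaningful unit, which is one plausible reason quality holds up better than with fixed-window downsampling.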

Related Research

10/15/2021
Why don't people use character-level machine translation?
We present a literature and empirical survey that critically assesses th...

04/29/2020
Towards Character-Level Transformer NMT by Finetuning Subword Systems
Applying the Transformer architecture on the character level usually req...

04/30/2020
Character-Level Translation with Self-attention
We explore the suitability of self-attention models for character-level ...

08/21/2020
Neural Machine Translation without Embeddings
Many NLP models follow the embed-contextualize-predict paradigm, in whic...

09/10/2020
On Target Segmentation for Direct Speech Translation
Recent studies on direct speech translation show continuous improvements...

12/22/2020
MailLeak: Obfuscation-Robust Character Extraction Using Transfer Learning
The following work presents a new algorithm for character recognition fr...

Code Repositories

SDD: Subword-Delimited Downsampling