
Persian Ezafe Recognition Using Transformers and Its Role in Part-Of-Speech Tagging

09/20/2020
by   Ehsan Doostmohammadi, et al.

Ezafe is a grammatical particle in some Iranian languages that links two words together. Despite the important information it conveys, it is almost never indicated in Persian script, leading to misreadings of complex sentences and errors in natural language processing tasks. In this paper, we experiment with different machine learning methods to achieve state-of-the-art results in the task of ezafe recognition. Transformer-based methods, BERT and XLM-RoBERTa, achieve the best results, the latter with a 2.68% F1-score improvement over the previous state-of-the-art. We then use ezafe information to improve Persian part-of-speech tagging results, show that such information is not useful to transformer-based methods, and explain why that might be the case.
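Ezafe recognition of this kind is naturally framed as binary token classification: each token in a sentence is labeled 1 if it carries an ezafe and 0 otherwise, and a sequence model (such as the BERT or XLM-RoBERTa taggers in the paper) predicts the label sequence. A minimal sketch of the task format, using invented transliterated examples and a trivial per-token majority baseline (not the paper's method) purely to illustrate the input/output shape:

```python
# Ezafe recognition as binary token classification:
# label 1 = token carries an ezafe (e.g. "ketab-e" in "ketab-e bozorg",
# "the big book"), label 0 = no ezafe. Examples are invented for illustration.
from collections import Counter, defaultdict

# Toy training data: (tokens, per-token ezafe labels).
train = [
    (["ketab", "bozorg"], [1, 0]),   # ketab-e bozorg
    (["dar", "bozorg"], [1, 0]),     # dar-e bozorg
    (["ketab", "man"], [1, 0]),      # ketab-e man
]

def fit_majority_baseline(data):
    """Learn each token's most frequent label; a weak non-contextual baseline."""
    counts = defaultdict(Counter)
    for tokens, labels in data:
        for tok, lab in zip(tokens, labels):
            counts[tok][lab] += 1
    return {tok: c.most_common(1)[0][0] for tok, c in counts.items()}

def predict(model, tokens):
    """Unseen tokens default to 0 (no ezafe), the majority class overall."""
    return [model.get(tok, 0) for tok in tokens]

model = fit_majority_baseline(train)
print(predict(model, ["ketab", "bozorg"]))  # → [1, 0]
```

A transformer tagger replaces the per-token lookup with contextual predictions, which is what lets it disambiguate tokens whose ezafe status depends on the surrounding words.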

