DiaNet: BERT and Hierarchical Attention Multi-Task Learning of Fine-Grained Dialect

10/31/2019 ∙ by Muhammad Abdul-Mageed, et al.

Prediction of language varieties and dialects is an important language processing task, with a wide range of applications. For Arabic, the native tongue of ~300 million people, most varieties remain unsupported. To ease this bottleneck, we present a very large scale dataset covering 319 cities from all 21 Arab countries. We introduce a hierarchical attention multi-task learning (HA-MTL) approach for dialect identification exploiting our data at the city, state, and country levels. We also evaluate the use of BERT on the three tasks, comparing it to the MTL approach. We benchmark and release our data and models.
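The core idea of the MTL setup — one shared encoder with attention pooling feeding separate classification heads for the city, state, and country tasks — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the encoder is abstracted to a single projection, the hidden size and state-label count are placeholders, and the 319 city and 21 country label counts are taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): token dim D, hidden dim H.
D, H = 32, 16
# 319 cities and 21 countries per the abstract; the state count is a placeholder.
N_CITY, N_STATE, N_COUNTRY = 319, 64, 21

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class SharedEncoderMTL:
    """Shared encoder + attention pooling + three task-specific heads.
    In the paper the encoder would be a deep network; here it is a single
    tanh projection so the multi-task wiring is easy to see."""
    def __init__(self):
        self.W_enc = rng.normal(scale=0.1, size=(D, H))   # shared encoder
        self.w_att = rng.normal(scale=0.1, size=(H,))     # attention vector
        self.heads = {                                    # one head per task
            "city": rng.normal(scale=0.1, size=(H, N_CITY)),
            "state": rng.normal(scale=0.1, size=(H, N_STATE)),
            "country": rng.normal(scale=0.1, size=(H, N_COUNTRY)),
        }

    def forward(self, tokens):
        """tokens: (seq_len, D) -> dict of per-task label distributions."""
        h = np.tanh(tokens @ self.W_enc)      # shared token representations
        alpha = softmax(h @ self.w_att)       # attention weights over tokens
        v = alpha @ h                         # attention-pooled text vector
        return {task: softmax(v @ W) for task, W in self.heads.items()}

model = SharedEncoderMTL()
probs = model.forward(rng.normal(size=(20, D)))
```

Because the encoder and attention parameters are shared across the three heads, a training signal from any one task (e.g. country) shapes the representation used by the others — the usual motivation for multi-task learning over related granularities.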


