Geographic Adaptation of Pretrained Language Models

03/16/2022
by Valentin Hofmann, et al.

Geographic linguistic features are commonly used to improve the performance of pretrained language models (PLMs) on NLP tasks where geographic knowledge is intuitively beneficial (e.g., geolocation prediction and dialect feature prediction). Existing work, however, leverages such geographic information in task-specific fine-tuning, failing to incorporate it into PLMs' geo-linguistic knowledge, which would make it transferable across different tasks. In this work, we introduce an approach to task-agnostic geoadaptation of PLMs that forces the PLM to learn associations between linguistic phenomena and geographic locations. More specifically, geoadaptation is an intermediate training step that couples masked language modeling and geolocation prediction in a dynamic multitask learning setup. In our experiments, we geoadapt BERTić – a PLM for Bosnian, Croatian, Montenegrin, and Serbian (BCMS) – using a corpus of geotagged BCMS tweets. Evaluation on three different tasks, namely unsupervised (zero-shot) and supervised geolocation prediction and (unsupervised) prediction of dialect features, shows that our geoadaptation approach is very effective: e.g., we obtain new state-of-the-art performance in supervised geolocation prediction and report massive gains over geographically uninformed PLMs on zero-shot geolocation prediction.
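To make the setup concrete, below is a minimal sketch of one geoadaptation-style training step, assuming a PyTorch / Hugging Face stack. The multilingual stand-in checkpoint, the mean-pooled regression head, and the uncertainty-based loss weighting are illustrative assumptions for the "dynamic multitask learning setup", not the authors' exact implementation.

```python
# Minimal sketch (illustrative only): masked language modeling coupled with
# geolocation regression, combined via learned uncertainty weights as one
# possible realization of a dynamic multitask scheme.
import torch
import torch.nn as nn
from transformers import AutoModelForMaskedLM, AutoTokenizer

class GeoAdapter(nn.Module):
    def __init__(self, plm_name: str):
        super().__init__()
        self.plm = AutoModelForMaskedLM.from_pretrained(plm_name)
        hidden = self.plm.config.hidden_size
        self.geo_head = nn.Linear(hidden, 2)  # predicts (latitude, longitude)
        # Assumption: dynamic task weighting via learned log-variances,
        # in the style of homoscedastic-uncertainty weighting.
        self.log_var_mlm = nn.Parameter(torch.zeros(()))
        self.log_var_geo = nn.Parameter(torch.zeros(()))

    def forward(self, input_ids, attention_mask, mlm_labels, coords):
        out = self.plm(input_ids=input_ids, attention_mask=attention_mask,
                       labels=mlm_labels, output_hidden_states=True)
        # Mean-pool the final hidden states as input to the geolocation head.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (out.hidden_states[-1] * mask).sum(1) / mask.sum(1).clamp(min=1e-9)
        geo_loss = nn.functional.mse_loss(self.geo_head(pooled), coords)
        # Dynamic weighting: each task loss is scaled by its learned precision.
        return (torch.exp(-self.log_var_mlm) * out.loss + self.log_var_mlm
                + torch.exp(-self.log_var_geo) * geo_loss + self.log_var_geo)

# Toy usage with a multilingual stand-in checkpoint (the paper geoadapts BERTić):
name = "bert-base-multilingual-cased"
tok = AutoTokenizer.from_pretrained(name)
model = GeoAdapter(name)
enc = tok("kako je lijepo u Zagrebu", return_tensors="pt")
labels = enc["input_ids"].clone()        # real MLM training would mask ~15% of tokens
coords = torch.tensor([[45.81, 15.98]])  # gold (lat, lon) of the geotagged tweet
loss = model(enc["input_ids"], enc["attention_mask"], labels, coords)
loss.backward()
```

Because the task weights are themselves trainable parameters of the loss, the balance between the language-modeling and geolocation objectives shifts over the course of training, which is one way the multitask coupling can be made dynamic rather than fixed.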


