The Importance of Context in Very Low Resource Language Modeling

05/10/2022
by Lukas Edman, et al.

This paper investigates very low resource language model pretraining, when fewer than 100 thousand sentences are available. We find that, in very low resource scenarios, statistical n-gram language models outperform state-of-the-art neural models. Our experiments show that this is mainly due to the former's focus on a local context. As such, we introduce three methods to improve a neural model's performance in the low-resource setting, finding that limiting the model's self-attention is the most effective one, improving on downstream tasks such as NLI and POS tagging by up to 5% in the three languages we test on: English, Hindi, and Turkish.
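The abstract does not spell out how the self-attention is limited; a common way to realize this idea is to restrict each token to a fixed window of neighboring tokens. The sketch below illustrates that assumption in PyTorch; the function names, the `window` parameter, and the masking scheme are illustrative and not taken from the paper's implementation.

```python
import torch

def local_attention_mask(seq_len: int, window: int) -> torch.Tensor:
    """Boolean mask where True marks positions that should NOT be attended to:
    token i may only attend to tokens within `window` positions of i."""
    idx = torch.arange(seq_len)
    dist = (idx[None, :] - idx[:, None]).abs()
    return dist > window

def windowed_self_attention(q, k, v, window: int):
    """Scaled dot-product self-attention restricted to a local window.
    q, k, v: tensors of shape (batch, seq_len, d_model)."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5           # (batch, seq, seq)
    mask = local_attention_mask(q.size(1), window).to(q.device)
    scores = scores.masked_fill(mask, float("-inf"))       # block distant tokens
    return torch.softmax(scores, dim=-1) @ v

# Example: one sequence of 8 tokens, hidden size 16, attention window of 2
x = torch.randn(1, 8, 16)
out = windowed_self_attention(x, x, x, window=2)
print(out.shape)  # torch.Size([1, 8, 16])
```

Restricting attention this way mimics the local context that an n-gram model relies on, which is the intuition the abstract gives for why the method helps in very low resource settings.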
