On the Choice of Auxiliary Languages for Improved Sequence Tagging

05/19/2020
by   Lukas Lange, et al.

Recent work has shown that embeddings from related languages can improve the performance of sequence tagging, even for monolingual models. In this analysis paper, we investigate whether the best auxiliary language can be predicted based on language distances, and show that the most closely related language is not always the best auxiliary language. Further, we show that attention-based meta-embeddings can effectively combine pre-trained embeddings from different languages for sequence tagging, and we set new state-of-the-art results for part-of-speech tagging in five languages.
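To make the central idea concrete, the sketch below shows one common formulation of attention-based meta-embeddings: each source embedding (e.g. one per language) is projected into a shared space, a small attention network scores each source per token, and the meta-embedding is the attention-weighted sum. This is a minimal illustration of the general technique, not the authors' exact architecture; the module name, dimensions, and scoring network are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionMetaEmbedding(nn.Module):
    """Combine token embeddings from several sources (e.g. languages).

    Each source is projected into a shared space; an attention network
    scores each source per token, and the output is the weighted sum.
    Hyperparameters here are illustrative assumptions.
    """

    def __init__(self, input_dims, shared_dim=256, attn_dim=64):
        super().__init__()
        # One linear projection per embedding source.
        self.projections = nn.ModuleList(
            nn.Linear(d, shared_dim) for d in input_dims
        )
        # Scoring network: shared_dim -> one scalar score per source.
        self.attn = nn.Sequential(
            nn.Linear(shared_dim, attn_dim),
            nn.Tanh(),
            nn.Linear(attn_dim, 1),
        )

    def forward(self, embeddings):
        # embeddings: list of tensors, each (batch, seq_len, input_dims[i])
        projected = [proj(e) for proj, e in zip(self.projections, embeddings)]
        stacked = torch.stack(projected, dim=2)  # (batch, seq, n_src, shared)
        scores = self.attn(stacked)              # (batch, seq, n_src, 1)
        weights = F.softmax(scores, dim=2)       # attention over sources
        return (weights * stacked).sum(dim=2)    # (batch, seq, shared)

# Usage: combine hypothetical 300-d monolingual and 768-d multilingual
# vectors for a batch of 8 sentences of length 20.
meta = AttentionMetaEmbedding(input_dims=[300, 768])
mono = torch.randn(8, 20, 300)
multi = torch.randn(8, 20, 768)
out = meta([mono, multi])  # (8, 20, 256), fed to a sequence-tagging head
```

Because the attention weights are computed per token, the model can lean on different auxiliary languages for different words, which is what makes this combination more flexible than simple concatenation or averaging.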
