Data-Efficient Cross-Lingual Transfer with Language-Specific Subnetworks

10/31/2022
by Rochelle Choenni, et al.

Large multilingual language models typically share their parameters across all languages, which enables cross-lingual task transfer but can also hinder learning when training updates from different languages conflict. In this paper, we propose novel methods that use language-specific subnetworks, which control cross-lingual parameter sharing, to reduce conflicts and increase positive transfer during fine-tuning. We introduce dynamic subnetworks, which are jointly updated with the model, and we combine our methods with meta-learning, an established but complementary technique for improving cross-lingual transfer. Finally, we provide an extensive analysis of how each of these methods affects the models.
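To make the core idea concrete, the sketch below shows one way language-specific subnetworks could gate parameter sharing during fine-tuning: a binary mask per language zeroes out gradients outside that language's subnetwork, so updates from different languages touch different parameter regions. This is an illustrative sketch only, not the authors' implementation; the random mask construction, the toy model, and the `masked_step` helper are all hypothetical stand-ins (in practice, masks are typically derived per language, e.g., via magnitude pruning).

```python
# Minimal sketch (assumptions: toy linear model, random per-language masks).
import torch
import torch.nn as nn

model = nn.Linear(16, 16)  # stand-in for a multilingual encoder
languages = ["en", "de", "sw"]

# One binary mask per language and per parameter tensor.
# Random masks are a placeholder for masks learned or pruned per language.
masks = {
    lang: {name: (torch.rand_like(p) > 0.5).float()
           for name, p in model.named_parameters()}
    for lang in languages
}

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

def masked_step(batch_x, batch_y, lang):
    """One fine-tuning step in which only `lang`'s subnetwork is updated."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(batch_x), batch_y)
    loss.backward()
    # Zero gradients for parameters outside this language's subnetwork,
    # limiting interference between updates from different languages.
    for name, p in model.named_parameters():
        if p.grad is not None:
            p.grad.mul_(masks[lang][name])
    optimizer.step()
    return loss.item()

# Example: a step on (dummy) Swahili data updates only the "sw" subnetwork.
x, y = torch.randn(8, 16), torch.randn(8, 16)
masked_step(x, y, "sw")
```

The paper's dynamic variant additionally updates the subnetwork assignments jointly with the model during training, rather than keeping the masks fixed as in this sketch.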
