Training Dynamics for Curriculum Learning: A Study on Monolingual and Cross-lingual NLU

10/22/2022
by Fenia Christopoulou, et al.

Curriculum Learning (CL) is a technique for training models by presenting examples in an order of typically increasing difficulty, with the aim of accelerating convergence and improving generalisability. Current approaches for Natural Language Understanding (NLU) tasks use CL to improve in-distribution performance, often via heuristic-oriented or task-agnostic difficulty metrics. In this work, we instead employ CL for NLU by taking advantage of training dynamics as difficulty metrics, i.e., statistics that measure the behavior of the model at hand on specific task-data instances during training, and we propose modifications of existing CL schedulers based on these statistics. Differently from existing works, we focus on evaluating models on in-distribution (ID), out-of-distribution (OOD), as well as zero-shot (ZS) cross-lingual transfer datasets. We show across several NLU tasks that CL with training dynamics can result in better performance, mostly in zero-shot cross-lingual transfer and OOD settings, with improvements of up to 8.5% in certain cases. Overall, our experiments indicate that training dynamics can lead to better-performing models with smoother training compared to other difficulty metrics, while being 20% faster on average. In addition, through analysis we shed light on the correlations of task-specific versus task-agnostic metrics.
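To make the idea concrete, the sketch below illustrates one common instantiation of training dynamics as a difficulty metric (per-example confidence and variability, in the style of dataset cartography) together with a simple competence-based scheduler that grows the pool of available examples over training. This is a minimal sketch, not the paper's exact method: `model`, `train_loader`, `optimizer`, the per-example `idx` field, and the function names are all assumptions introduced for illustration.

```python
import numpy as np
import torch
import torch.nn.functional as F

# Minimal sketch, assuming a PyTorch classifier. Batches are expected to
# carry a persistent example index `idx` (a CPU LongTensor) so dynamics
# can be accumulated per training instance across epochs.

def record_training_dynamics(model, train_loader, optimizer, num_epochs,
                             num_examples, device="cpu"):
    """Train normally while logging p_model(gold label) per example/epoch."""
    gold_probs = np.zeros((num_epochs, num_examples))
    model.to(device)
    for epoch in range(num_epochs):
        model.train()
        for batch in train_loader:
            inputs = batch["inputs"].to(device)
            labels = batch["labels"].to(device)
            logits = model(inputs)
            loss = F.cross_entropy(logits, labels)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            with torch.no_grad():
                probs = torch.softmax(logits, dim=-1)
                gold = probs.gather(1, labels.unsqueeze(1)).squeeze(1)
            gold_probs[epoch, batch["idx"].numpy()] = gold.cpu().numpy()
    # Training-dynamics statistics per example:
    #   confidence:  mean gold-label probability across epochs (high = easy)
    #   variability: std of that probability (high = ambiguous)
    confidence = gold_probs.mean(axis=0)
    variability = gold_probs.std(axis=0)
    return confidence, variability

def competence_pool(confidence, step, total_steps, c0=0.1):
    """Competence-based curriculum (in the style of Platanios et al., 2019):
    at step t, sample batches only from the easiest c(t) fraction of
    examples, here ranked by descending confidence."""
    c = min(1.0, np.sqrt(step / total_steps * (1 - c0 ** 2) + c0 ** 2))
    easiest_first = np.argsort(-confidence)
    return easiest_first[: max(1, int(c * len(confidence)))]
```

A second training run can then draw each batch from `competence_pool` at the current step, exposing harder examples as competence grows; the scheduler modifications studied in the paper operate in this general space, but the square-root schedule above is only one illustrative choice.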


Related research

07/15/2023
Is Prompt-Based Finetuning Always Better than Vanilla Finetuning? Insights from Cross-Lingual Language Understanding
Multilingual pretrained language models (MPLMs) have demonstrated substa...

05/26/2020
English Intermediate-Task Training Improves Zero-Shot Cross-Lingual Transfer Too
Intermediate-task training has been shown to substantially improve pretr...

02/24/2021
Task-Specific Pre-Training and Cross Lingual Transfer for Code-Switched Data
Using task-specific pre-training and leveraging cross-lingual transfer a...

09/23/2020
Worst-Case-Aware Curriculum Learning for Zero and Few Shot Transfer
Multi-task transfer learning based on pre-trained language encoders achi...

10/10/2020
Zero-Shot Translation Quality Estimation with Explicit Cross-Lingual Patterns
This paper describes our submission of the WMT 2020 Shared Task on Sente...

03/08/2021
Meta-Learning with MAML on Trees
In meta-learning, the knowledge learned from previous tasks is transferr...

05/23/2022
The Importance of Being Parameters: An Intra-Distillation Method for Serious Gains
Recent model pruning methods have demonstrated the ability to remove red...
