Friend-training: Learning from Models of Different but Related Tasks

01/31/2023
by   Mian Zhang, et al.

Current self-training methods, such as standard self-training, co-training, and tri-training, typically focus on improving model performance on a single task by exploiting differences in input features, model architectures, and training processes. However, many tasks in natural language processing concern different but related aspects of language, and models trained for one task can be great teachers for related tasks. In this work, we propose friend-training, a cross-task self-training framework in which models trained on different tasks go through an iterative process of training, pseudo-labeling, and retraining, helping each other select better pseudo-labels. In a case study on two dialogue understanding tasks, conversational semantic role labeling and dialogue rewriting, we show that models trained with the friend-training framework outperform strong baselines.
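To make the iterative loop concrete, below is a minimal Python sketch of the cross-task selection idea described in the abstract. All names here (TaskModel, agreement_score, tau, rounds) are hypothetical stand-ins, not the authors' implementation; in particular, the agreement measure between the two tasks' pseudo-labels is a placeholder for whatever cross-task consistency signal the framework uses.

```python
# Hypothetical sketch of a friend-training loop: two models trained on
# different but related tasks pseudo-label shared unlabeled data, and each
# pseudo-label is kept only when the "friend" model's prediction agrees.
import random


class TaskModel:
    """Stand-in for a trainable task model (e.g., CSRL or dialogue rewriting)."""

    def __init__(self, name):
        self.name = name

    def train(self, labeled_data):
        # Placeholder: fit the model on (input, label) pairs.
        pass

    def predict(self, x):
        # Placeholder: return a pseudo-label for input x.
        return f"{self.name}-label({x})"


def agreement_score(label_a, label_b):
    """Hypothetical cross-task agreement: how consistent one model's
    pseudo-label is with its friend's (e.g., whether CSRL argument spans
    are preserved by the rewritten utterance)."""
    return random.random()  # stand-in for a real consistency measure


def friend_training(model_a, model_b, labeled_a, labeled_b,
                    unlabeled, rounds=3, tau=0.8):
    """Iterate training, pseudo-labeling, and retraining; each model's
    pseudo-labels are filtered using its friend's predictions."""
    for _ in range(rounds):
        model_a.train(labeled_a)
        model_b.train(labeled_b)
        for x in list(unlabeled):
            ya, yb = model_a.predict(x), model_b.predict(x)
            # Accept the pseudo-labels only if the two tasks agree enough.
            if agreement_score(ya, yb) >= tau:
                labeled_a.append((x, ya))
                labeled_b.append((x, yb))
                unlabeled.remove(x)
    return model_a, model_b


# Example usage with toy data.
csrl, rewriter = TaskModel("csrl"), TaskModel("rewrite")
friend_training(csrl, rewriter,
                labeled_a=[("dlg1", "frame1")],
                labeled_b=[("dlg1", "rewrite1")],
                unlabeled=["dlg2", "dlg3", "dlg4"])
```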


