Mitigating Negative Transfer in Multi-Task Learning with Exponential Moving Average Loss Weighting Strategies

11/22/2022
by   Anish Lakkapragada, et al.

Multi-Task Learning (MTL) is a growing subject of interest in deep learning, due to its ability to train models more efficiently on multiple tasks than a group of conventional single-task models. However, MTL can be impractical because certain tasks can dominate training and hurt performance on others, so that some tasks perform better in a single-task model than in a multi-task one. Such problems are broadly classified as negative transfer, and many approaches have been proposed in the literature to mitigate them. One current approach to alleviating negative transfer is to weight each of the losses so that they are on the same scale. Yet while current loss-balancing approaches rely on either optimization or complex numerical analysis, none directly scale the losses based on their observed magnitudes. We propose multiple techniques for loss balancing based on scaling by the exponential moving average and benchmark them against current best-performing methods on three established datasets. On these datasets, they achieve comparable, if not higher, performance compared to current best-performing methods.
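The core idea — scaling each task loss by a running estimate of its magnitude so all losses sit on a similar scale — can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the class name, the smoothing factor `beta`, and the inverse-EMA weighting rule are assumptions for demonstration.

```python
class EMALossWeighter:
    """Illustrative sketch: weight each task loss by the inverse of an
    exponential moving average (EMA) of its observed magnitude, so that
    the weighted losses are all roughly on the same scale."""

    def __init__(self, num_tasks, beta=0.9, eps=1e-8):
        self.beta = beta          # EMA smoothing factor (assumed value)
        self.eps = eps            # guards against division by zero
        self.ema = [None] * num_tasks

    def weights(self, losses):
        ws = []
        for i, loss in enumerate(losses):
            # Update the running EMA of this task's loss magnitude.
            if self.ema[i] is None:
                self.ema[i] = loss
            else:
                self.ema[i] = self.beta * self.ema[i] + (1 - self.beta) * loss
            # Dividing by the EMA normalizes the loss to roughly unit scale.
            ws.append(1.0 / (self.ema[i] + self.eps))
        return ws

    def combined(self, losses):
        """Return the sum of the EMA-scaled task losses."""
        return sum(w * l for w, l in zip(self.weights(losses), losses))
```

Even when raw task losses differ by several orders of magnitude, the scaled losses each contribute roughly equally to the combined objective, which is the behavior loss balancing aims for.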
