Mitigating Negative Transfer with Task Awareness for Sexism, Hate Speech, and Toxic Language Detection

This paper proposes a novel approach to mitigating the negative transfer problem. In machine learning, the common strategy is Single-Task Learning: training a supervised model to solve one specific task. Training a robust model, however, requires large amounts of data and significant computational resources, which makes this solution unfeasible when data are unavailable or expensive to gather. An alternative based on sharing information between tasks has therefore been developed: Multi-Task Learning (MTL). Despite recent developments in MTL, the problem of negative transfer remains unsolved. Negative transfer is a phenomenon that occurs when noisy information is shared between tasks, resulting in a drop in performance. The approach proposed here, based on the concept of task awareness, diminishes negative transfer and improves performance over classic MTL solutions. Moreover, the approach has been implemented in two unified architectures to detect Sexism, Hate Speech, and Toxic Language in text comments. These architectures set a new state of the art on both the EXIST-2021 and HatEval-2019 benchmarks.
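To make the task-awareness idea concrete, the following is a minimal sketch of what conditioning a shared MTL encoder on a task signal can look like: a learned task embedding is concatenated to the input so the shared layer "knows" which task it is serving, while each task keeps its own output head. All names, dimensions, and the concatenation scheme are illustrative assumptions for exposition, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

N_TASKS = 3   # e.g., sexism, hate speech, toxic language
D_IN = 16     # input feature size (e.g., a pooled text embedding)
D_TASK = 4    # task-embedding size
D_HID = 8     # shared hidden size

# Parameters (randomly initialized here; trained jointly in practice)
task_emb = rng.normal(size=(N_TASKS, D_TASK))        # task-awareness signal
W_shared = rng.normal(size=(D_IN + D_TASK, D_HID))   # shared encoder weights
W_heads = rng.normal(size=(N_TASKS, D_HID, 1))       # task-specific heads

def forward(x, task_id):
    """Return a binary score for one task, with a task-aware shared encoder."""
    z = np.concatenate([x, task_emb[task_id]], axis=-1)  # inject task identity
    h = np.tanh(z @ W_shared)                            # shared representation
    logit = h @ W_heads[task_id]                         # task-specific head
    return 1.0 / (1.0 + np.exp(-logit))                  # sigmoid probability

x = rng.normal(size=(D_IN,))
probs = [float(forward(x, t)) for t in range(N_TASKS)]
print(probs)  # three task-specific probabilities in (0, 1)
```

The intuition is that the task embedding lets the shared encoder route or modulate information per task, which is one way an architecture can limit noisy cross-task sharing (the source of negative transfer) while still benefiting from shared parameters.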

