Distribution Matching for Heterogeneous Multi-Task Learning: a Large-scale Face Study

05/08/2021
by Dimitrios Kollias, et al.

Multi-Task Learning (MTL) has emerged as a methodology in which multiple tasks are jointly learned by a shared learning algorithm, such as a DNN. MTL is based on the assumption that the tasks under consideration are related; it therefore exploits shared knowledge to improve performance on each individual task. Tasks are generally considered to be homogeneous, i.e., to refer to the same type of problem. Moreover, MTL is usually based on ground truth annotations with full or partial overlap across tasks. In this work, we deal with heterogeneous MTL, simultaneously addressing detection, classification and regression problems. We explore task-relatedness as a means for co-training, in a weakly-supervised way, tasks whose annotations overlap only partially, or not at all. Task-relatedness is introduced into MTL either explicitly, through prior expert knowledge, or through data-driven studies. We propose a novel distribution matching approach, in which knowledge exchange between tasks is enabled via matching of their predictions' distributions. Based on this approach, we build FaceBehaviorNet, the first framework for large-scale face analysis, by jointly learning all facial behavior tasks. We develop case studies for: i) continuous affect estimation, action unit detection and basic emotion recognition; ii) attribute detection and face identification. We illustrate that co-training via task-relatedness alleviates negative transfer. Since FaceBehaviorNet learns features that encapsulate all aspects of facial behavior, we conduct zero-/few-shot learning to perform tasks beyond the ones it has been trained for, such as compound emotion recognition. In a very large experimental study utilizing 10 databases, we illustrate that our approach outperforms the state-of-the-art, by large margins, in all tasks and in all databases, even in those that were not used in its training.
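
To make the prediction-distribution coupling more concrete, below is a minimal sketch of how one task's predictions can supervise another's through a prior relatedness matrix. This is not the paper's implementation: the matrix `relatedness` (a hypothetical prior P(AU | expression)), the tensor shapes and the function name are assumptions introduced here for illustration only.

```python
# Minimal sketch (not the authors' code) of co-training an expression head and an
# action-unit head by matching their predictions' distributions via a prior coupling.

import torch
import torch.nn.functional as F

def distribution_matching_loss(expr_logits, au_logits, relatedness):
    """Encourage AU predictions to agree with the AU distribution implied by the
    expression predictions through a (hypothetical) prior relatedness matrix.

    expr_logits : (batch, n_expr)  raw scores for basic-expression classification
    au_logits   : (batch, n_aus)   raw scores for action-unit detection
    relatedness : (n_expr, n_aus)  assumed prior P(AU active | expression)
    """
    expr_probs = F.softmax(expr_logits, dim=-1)   # categorical over expressions
    au_probs = torch.sigmoid(au_logits)           # independent AU activations

    # AU distribution "implied" by the expression head via the prior coupling
    implied_au = (expr_probs @ relatedness).clamp(0.0, 1.0)   # (batch, n_aus)

    # Binary cross-entropy between predicted and implied AU activations serves as
    # the matching term; samples lacking AU annotations can still contribute here.
    return F.binary_cross_entropy(au_probs, implied_au)
```

In such a setup, this matching term would be added to the ordinary supervised losses, letting samples annotated for only one task still provide a weak training signal to the other.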

