MultiMatch: Multi-task Learning for Semi-supervised Domain Generalization

08/11/2022
by   Lei Qi, et al.

Domain generalization (DG) aims to learn a model on source domains that generalizes well to an unseen target domain. Although it has achieved great success, most existing methods require label information for all training samples in the source domains, which is time-consuming and expensive in real-world applications. In this paper, we tackle the semi-supervised domain generalization (SSDG) task, in which only a few labeled samples are available in each source domain. To address this task, we first analyze the theory of multi-domain learning, which highlights that 1) mitigating the impact of the domain gap and 2) exploiting all samples to train the model can effectively reduce the generalization error in each source domain and thereby improve the quality of pseudo-labels. Based on this analysis, we propose MultiMatch, which extends FixMatch to a multi-task learning framework to produce high-quality pseudo-labels for SSDG. Specifically, we treat each training domain as a single task (i.e., a local task) and combine all training domains together (i.e., the global task) to train an extra task for the unseen test domain. In the multi-task framework, each task has its own BN layers and classifier, which effectively alleviates interference from other domains during pseudo-labeling, while most of the framework's parameters are shared and can therefore be sufficiently trained on all training samples. Moreover, to further boost pseudo-label accuracy and the model's generalization, we fuse the predictions from the global task and the local tasks during training and testing, respectively. A series of experiments validates the effectiveness of the proposed method, which outperforms existing semi-supervised methods and the SSDG method on several benchmark DG datasets.
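To make the architecture described in the abstract concrete, below is a minimal, hedged PyTorch sketch (not the authors' code): a shared backbone, one independent BN + classifier pair per source domain ("local tasks"), an extra pair trained on all domains together ("global task"), and an averaged fusion of global and local predictions for pseudo-labeling. The backbone layout, feature dimension, and the averaging fusion rule are assumptions for illustration only.

```python
# Illustrative sketch of a MultiMatch-style multi-task network.
# Assumptions (not from the paper): toy MLP backbone, 512-d features,
# simple 0.5/0.5 averaging as the prediction-fusion rule.
import torch
import torch.nn as nn


class MultiTaskNet(nn.Module):
    def __init__(self, num_domains: int, num_classes: int, feat_dim: int = 512):
        super().__init__()
        # Shared parameters, trained on samples from every source domain.
        self.backbone = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 32 * 32, feat_dim),
            nn.ReLU(inplace=True),
        )
        # One independent BN + classifier per local task, plus one extra pair
        # (index num_domains) for the global task trained on all domains.
        self.bns = nn.ModuleList(
            [nn.BatchNorm1d(feat_dim) for _ in range(num_domains + 1)]
        )
        self.classifiers = nn.ModuleList(
            [nn.Linear(feat_dim, num_classes) for _ in range(num_domains + 1)]
        )
        self.num_domains = num_domains

    def forward(self, x: torch.Tensor, task: int) -> torch.Tensor:
        """Shared backbone followed by the BN and classifier of `task`."""
        feat = self.backbone(x)
        return self.classifiers[task](self.bns[task](feat))

    @torch.no_grad()
    def fused_prediction(self, x: torch.Tensor, domain: int) -> torch.Tensor:
        """Average global-task and local-task probabilities (assumed fusion rule),
        e.g. to produce pseudo-labels for unlabeled data from `domain`."""
        p_local = self.forward(x, domain).softmax(dim=-1)
        p_global = self.forward(x, self.num_domains).softmax(dim=-1)
        return 0.5 * (p_local + p_global)


if __name__ == "__main__":
    model = MultiTaskNet(num_domains=3, num_classes=7)
    images = torch.randn(8, 3, 32, 32)          # toy batch from source domain 1
    logits = model(images, task=1)               # local-task logits for training
    pseudo = model.fused_prediction(images, 1)   # fused probabilities for pseudo-labeling
    print(logits.shape, pseudo.shape)            # torch.Size([8, 7]) twice
```

In this sketch the per-task BN and classifier isolate domain-specific statistics, while the shared backbone is updated by every sample; a FixMatch-style threshold on the fused probabilities could then select confident pseudo-labels for the unlabeled data.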


Related research

09/26/2020 · Domain Generalization via Semi-supervised Meta Learning
The goal of domain generalization is to learn from multiple source domai...

03/15/2020 · Beyond without Forgetting: Multi-Task Learning for Classification with Disjoint Datasets
Multi-task Learning (MTL) for classification with disjoint datasets aims...

08/07/2022 · Label-Efficient Domain Generalization via Collaborative Exploration and Generalization
Considerable progress has been made in domain generalization (DG) which ...

08/22/2022 · Detect Hate Speech in Unseen Domains using Multi-Task Learning: A Case Study of Political Public Figures
Automatic identification of hateful and abusive content is vital in comb...

11/19/2021 · Semi-Supervised Domain Generalization in Real World: New Benchmark and Strong Baseline
Conventional domain generalization aims to learn domain invariant repres...

02/12/2022 · A multi-task semi-supervised framework for Text2Graph & Graph2Text
The Artificial Intelligence industry regularly develops applications tha...

03/16/2020 · Interpretable MTL from Heterogeneous Domains using Boosted Tree
Multi-task learning (MTL) aims at improving the generalization performan...
