Disentanglement by Cyclic Reconstruction

12/24/2021
by David Bertoin et al.

Deep neural networks have demonstrated their ability to automatically extract meaningful features from data. However, in supervised learning, information specific to the dataset used for training, but irrelevant to the task at hand, may remain encoded in the extracted representations. This residual information introduces a domain-specific bias, weakening generalization performance. In this work, we propose splitting the information into a task-related representation and its complementary context representation. We propose an original method, combining adversarial feature predictors and cyclic reconstruction, to disentangle these two representations in the single-domain supervised case. We then adapt this method to the unsupervised domain adaptation problem, which consists in training a model that performs well on both a source and a target domain. In particular, our method promotes disentanglement in the target domain, despite the absence of training labels. This enables the isolation of task-specific information from both domains and a projection into a common representation. The task-specific representation allows efficient transfer of knowledge acquired from the source domain to the target domain. In the single-domain case, we demonstrate the quality of our representations on information retrieval tasks and the generalization benefits induced by sharpened task-specific representations. We then validate the proposed method on several classical domain adaptation benchmarks and illustrate the benefits of disentanglement for domain adaptation.
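To make the ingredients concrete, here is a minimal PyTorch sketch of the general idea described in the abstract: two encoders split the input into a task representation and a context representation, an adversarial predictor tries to recover the label from the context representation (which the context encoder is trained to defeat), and a reconstruction plus a cycle term keep the pair of representations informative. This is an illustrative sketch under assumed dimensions and module names, not the paper's implementation; the exact form of the cyclic term is an assumption.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Disentangler(nn.Module):
    """Two encoders split x into a task representation z_t and a
    context representation z_c; a decoder reconstructs x from both.
    (All layer sizes here are illustrative.)"""

    def __init__(self, in_dim=16, rep_dim=8, n_classes=3):
        super().__init__()
        self.enc_task = nn.Sequential(nn.Linear(in_dim, rep_dim), nn.ReLU())
        self.enc_ctx = nn.Sequential(nn.Linear(in_dim, rep_dim), nn.ReLU())
        self.decoder = nn.Linear(2 * rep_dim, in_dim)
        self.clf = nn.Linear(rep_dim, n_classes)   # task head, reads z_t
        self.adv = nn.Linear(rep_dim, n_classes)   # adversary, reads z_c

    def forward(self, x):
        z_t, z_c = self.enc_task(x), self.enc_ctx(x)
        x_hat = self.decoder(torch.cat([z_t, z_c], dim=1))
        return z_t, z_c, x_hat


def training_losses(model, x, y):
    z_t, z_c, x_hat = model(x)
    # Main task: classify from the task representation only.
    task_loss = F.cross_entropy(model.clf(z_t), y)
    # Adversary learns to read the label out of z_c (detached so this
    # update does not help the context encoder) ...
    adv_loss = F.cross_entropy(model.adv(z_c.detach()), y)
    # ... while the context encoder is pushed to make that impossible.
    conf_loss = -F.cross_entropy(model.adv(z_c), y)
    # Reconstruction keeps (z_t, z_c) jointly informative about x ...
    recon_loss = F.mse_loss(x_hat, x)
    # ... and a cyclic term re-encodes the reconstruction and asks for
    # the same task representation back.
    cycle_loss = F.mse_loss(model.enc_task(x_hat), z_t.detach())
    return task_loss, adv_loss, conf_loss, recon_loss, cycle_loss
```

In practice, the adversary and the encoders would be updated with separate optimizers (or a gradient-reversal layer), alternating as in standard adversarial training; in the unsupervised domain adaptation setting, the reconstruction and cycle terms also apply to unlabeled target-domain inputs.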


Related research

10/28/2019
Deep causal representation learning for unsupervised domain adaptation
Studies show that the representations learned by deep neural networks ca...

04/20/2022
Deep transfer learning for partial differential equations under conditional shift with DeepONet
Traditional machine learning algorithms are designed to learn in isolati...

07/01/2018
Augmented Cyclic Adversarial Learning for Domain Adaptation
Training a model to perform a task typically requires a large amount of ...

09/13/2021
Variational Disentanglement for Domain Generalization
Domain generalization aims to learn an invariant model that can generali...

01/03/2023
Heterogeneous Domain Adaptation and Equipment Matching: DANN-based Alignment with Cyclic Supervision (DBACS)
Process monitoring and control are essential in modern industries for en...

07/27/2020
Learning Task-oriented Disentangled Representations for Unsupervised Domain Adaptation
Unsupervised domain adaptation (UDA) aims to address the domain-shift pr...

06/15/2023
Building blocks for complex tasks: Robust generative event extraction for radiology reports under domain shifts
This paper explores methods for extracting information from radiology re...
