Latent Domain Learning with Dynamic Residual Adapters

06/01/2020
by   Lucas Deecke, et al.

A practical shortcoming of deep neural networks is their specialization to a single task and domain. While recent techniques in domain adaptation and multi-domain learning enable the learning of more domain-agnostic features, their success relies on the presence of domain labels, typically requiring manual annotation and careful curation of datasets. Here we focus on a less explored but more realistic case: learning from data drawn from multiple domains, without access to domain annotations. In this scenario, standard model training overfits to large domains while disregarding smaller ones. We address this limitation via dynamic residual adapters, an adaptive gating mechanism that helps account for latent domains, coupled with an augmentation strategy inspired by recent style transfer techniques. We evaluate the proposed approach on image classification tasks containing multiple latent domains, and demonstrate robust performance across all of them. Dynamic residual adapters significantly outperform off-the-shelf networks with much larger capacity, and can be incorporated seamlessly into existing architectures in an end-to-end manner.
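To make the core idea concrete, here is a minimal numpy sketch of a gated residual adapter of the kind the abstract describes: a small set of per-domain adapter transforms whose outputs are mixed by an input-conditioned soft gate and added back to the feature via a residual connection. All names, shapes, and initializations below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(z):
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

class DynamicResidualAdapter:
    """Hedged sketch: K lightweight adapters, one per hypothesised latent
    domain, mixed by a softmax gate computed from the input feature.
    Output = h + sum_k alpha_k(h) * A_k h  (residual correction)."""

    def __init__(self, channels, num_latent_domains, rng=None):
        rng = rng or np.random.default_rng(0)
        # One small linear adapter per hypothesised latent domain.
        self.adapters = [
            rng.normal(0.0, 0.01, size=(channels, channels))
            for _ in range(num_latent_domains)
        ]
        # Gating network: projects the feature to K domain logits.
        self.gate_w = rng.normal(0.0, 0.01, size=(num_latent_domains, channels))

    def __call__(self, h):
        # Soft assignment of the input to the K latent domains.
        alpha = softmax(self.gate_w @ h)                      # shape (K,)
        # Gated mixture of adapter outputs, applied residually.
        correction = sum(a * (A @ h) for a, A in zip(alpha, self.adapters))
        return h + correction, alpha

# Usage: an 8-channel feature routed across 3 hypothesised latent domains.
adapter = DynamicResidualAdapter(channels=8, num_latent_domains=3)
h = np.ones(8)
out, gates = adapter(h)
```

Because the adapters are small residual corrections around an unchanged identity path, such a module can in principle be dropped into an existing backbone and trained end-to-end, matching the plug-in property the abstract claims.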

