Label Efficient Learning of Transferable Representations across Domains and Tasks

11/30/2017
by   Zelun Luo, et al.

We propose a framework that learns a representation transferable across different domains and tasks in a label-efficient manner. Our approach battles domain shift with a domain adversarial loss, and generalizes the embedding to novel tasks using a metric learning-based approach. Our model is simultaneously optimized on labeled source data and unlabeled or sparsely labeled data in the target domain. Our method shows compelling results on novel classes within a new domain even when only a few labeled examples per class are available, outperforming the prevalent fine-tuning approach. In addition, we demonstrate the effectiveness of our framework on the transfer learning task from image object recognition to video action recognition.
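The combined objective described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; all names (linear heads `W_cls`, `W_dom`, the weighting `lam`, the toy data) are hypothetical, and it shows only how a supervised classification loss on labeled source data combines with a domain adversarial term, where gradient reversal flips the sign of the domain loss for the feature extractor.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    # mean negative log-likelihood of the true labels
    return -np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean()

rng = np.random.default_rng(0)

# toy features: labeled source domain, unlabeled target domain
src_feat = rng.normal(size=(8, 4))
tgt_feat = rng.normal(size=(8, 4))
src_labels = rng.integers(0, 3, size=8)

# hypothetical linear heads: task classifier and domain discriminator
W_cls = rng.normal(size=(4, 3))
W_dom = rng.normal(size=(4, 2))

# supervised classification loss on labeled source data
cls_loss = cross_entropy(softmax(src_feat @ W_cls), src_labels)

# domain discriminator tries to tell source (0) from target (1)
feats = np.vstack([src_feat, tgt_feat])
dom_labels = np.array([0] * 8 + [1] * 8)
dom_loss = cross_entropy(softmax(feats @ W_dom), dom_labels)

# adversarial setup: the discriminator minimizes dom_loss, while the
# feature extractor minimizes cls_loss - lam * dom_loss (a gradient
# reversal layer implements the sign flip during backpropagation)
lam = 0.1
feature_objective = cls_loss - lam * dom_loss
print(f"cls_loss={cls_loss:.4f}  dom_loss={dom_loss:.4f}  "
      f"feature_objective={feature_objective:.4f}")
```

In a full training loop both losses would be backpropagated through a shared feature extractor; here the features are fixed random vectors purely to make the arithmetic of the combined objective concrete.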


