Towards All-around Knowledge Transferring: Learning From Task-irrelevant Labels

11/17/2020
by Yinghui Li, et al.

Deep neural models have achieved strong performance on numerous classification tasks, but they require large amounts of manually annotated data. Since annotating adequate data for every classification task is extremely time-consuming and expensive, learning models that generalize well from small datasets has received increasing attention. Existing efforts mainly focus on transferring task-relevant knowledge from similar data to tackle this issue. These approaches have yielded remarkable improvements, yet they neglect the fact that task-irrelevant features can cause severe negative transfer. To date, no large-scale studies have investigated the impact of task-irrelevant features, let alone how to utilize them. In this paper, we propose Task-Irrelevant Transfer Learning (TIRTL), the first method to exploit task-irrelevant features, which are mainly extracted from task-irrelevant labels. Specifically, we suppress the expression of task-irrelevant information and thereby facilitate learning of the classification task. We also provide a theoretical explanation of our method. In addition, TIRTL does not conflict with approaches that exploit task-relevant knowledge and can be combined with them, enabling the simultaneous utilization of task-relevant and task-irrelevant features for the first time. To verify the effectiveness of our theory and method, we conduct extensive experiments on facial expression recognition and digit recognition tasks. Our source code will also be released for reproducibility.
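The abstract states that TIRTL suppresses the expression of task-irrelevant information while learning the main classification task. As a concrete illustration of this general idea (not the paper's confirmed formulation), below is a minimal PyTorch sketch that attaches a gradient-reversal auxiliary head predicting the task-irrelevant labels, so the shared encoder is discouraged from encoding them. The architecture, the input/class dimensions, and the gradient-reversal mechanism itself are illustrative assumptions.

```python
# Minimal sketch: suppressing task-irrelevant information via a
# gradient-reversal auxiliary head. Illustrative only; TIRTL's actual
# mechanism may differ from this assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reversed, scaled gradient in the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class TaskIrrelevantSuppressor(nn.Module):
    def __init__(self, in_dim=784, feat_dim=128,
                 num_task_classes=7, num_irrelevant_classes=10, lambd=1.0):
        super().__init__()
        # Shared encoder whose features should keep task-relevant information
        # and discard task-irrelevant information.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim), nn.ReLU(),
        )
        self.task_head = nn.Linear(feat_dim, num_task_classes)
        self.irrelevant_head = nn.Linear(feat_dim, num_irrelevant_classes)
        self.lambd = lambd

    def forward(self, x):
        feats = self.encoder(x)
        task_logits = self.task_head(feats)
        # The auxiliary head learns to predict task-irrelevant labels, while
        # the reversed gradient pushes the encoder to make them unpredictable,
        # i.e. their expression is suppressed in the shared features.
        irr_logits = self.irrelevant_head(GradReverse.apply(feats, self.lambd))
        return task_logits, irr_logits


if __name__ == "__main__":
    model = TaskIrrelevantSuppressor()
    x = torch.randn(8, 784)              # e.g. flattened 28x28 digit images
    y_task = torch.randint(0, 7, (8,))   # task-relevant labels (e.g. expressions)
    y_irr = torch.randint(0, 10, (8,))   # task-irrelevant labels
    task_logits, irr_logits = model(x)
    loss = F.cross_entropy(task_logits, y_task) + F.cross_entropy(irr_logits, y_irr)
    loss.backward()
```

Because both heads share one encoder, this sketch also composes naturally with existing task-relevant transfer methods (e.g. initializing the encoder from a pretrained model), in line with the abstract's claim that task-relevant and task-irrelevant knowledge can be used simultaneously.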


Related research

07/09/2020  Deep Multi-task Learning for Facial Expression Recognition and Synthesis Based on Selective Feature Sharing
Multi-task learning is an effective learning strategy for deep-learning-...

09/08/2022  Cross-Modal Knowledge Transfer Without Task-Relevant Source Data
Cost-effective depth and infrared sensors as alternatives to usual RGB s...

06/08/2023  Generalization Performance of Transfer Learning: Overparameterized and Underparameterized Regimes
Transfer learning is a useful technique for achieving improved performan...

06/10/2022  NR-DFERNet: Noise-Robust Network for Dynamic Facial Expression Recognition
Dynamic facial expression recognition (DFER) in the wild is an extremely...

03/22/2017  Joint Intermodal and Intramodal Label Transfers for Extremely Rare or Unseen Classes
In this paper, we present a label transfer model from texts to images fo...

07/24/2021  Semantic-guided Pixel Sampling for Cloth-Changing Person Re-identification
Cloth-changing person re-identification (re-ID) is a new rising research...

05/08/2020  Relatedness Measures to Aid the Transfer of Building Blocks among Multiple Tasks
Multitask Learning is a learning paradigm that deals with multiple diffe...
