Network Transplanting

04/26/2018
by   Quanshi Zhang, et al.

This paper focuses on a novel problem: transplanting a category- and task-specific neural network into a generic, distributed network without strong supervision. Like assembling LEGO blocks, incrementally constructing a generic network by asynchronously merging specific neural networks is a crucial bottleneck for deep learning. Suppose the pre-trained specific network contains a module f that extracts features of the target category, and the generic network has a module g for a target task, trained on categories other than the target category. Instead of using numerous training samples to teach the generic network the new category, we aim to learn a small adapter module that connects f and g, so that the task can be accomplished on the target category in a weakly supervised manner. The core challenge is to efficiently learn the feature projections between the two connected modules. We propose a new distillation algorithm, which exhibited superior performance: even without training samples, our method significantly outperformed the baseline trained with 100 samples.
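To make the setup concrete, here is a minimal numpy sketch of the transplanting idea: a frozen feature module f, a frozen task module g, and a small adapter fitted between them against a distillation target, with no labels involved. All shapes and names here are hypothetical toy stand-ins (the modules are linear/tanh layers, and the adapter is solved by least squares), not the paper's actual back-distillation algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy, frozen stand-ins for the two pre-trained modules (hypothetical shapes):
#   f: feature extractor from the category-specific network
#   g: task module of the generic network
W_f = rng.standard_normal((8, 16)) * 0.5
W_g = rng.standard_normal((16, 4)) * 0.5

def f(x):                        # frozen feature extractor
    return np.tanh(x @ W_f)

def g(h):                        # frozen generic task module
    return h @ W_g

# Distillation target: fabricated here from a hidden projection A_true,
# standing in for the specific network's own outputs on the target category.
A_true = rng.standard_normal((16, 16)) * 0.1

X = rng.standard_normal((256, 8))    # unlabeled inputs of the target category
H = f(X)                             # features produced by frozen f
Y_teacher = g(H @ A_true)            # outputs the adapter should reproduce

# Fit ONLY the adapter A (f and g stay frozen) on the distillation objective
# ||g(f(x) @ A) - y_teacher||^2. Because g is linear in this toy, it suffices
# to match the projected features H @ A_true by least squares.
A, *_ = np.linalg.lstsq(H, H @ A_true, rcond=None)

distill_loss = float(np.mean((g(H @ A) - Y_teacher) ** 2))
```

The point of the sketch is the division of labor: all learning happens in the small adapter A, while the expensive modules f and g are reused as-is, so no labeled samples of the target category are needed, only a teacher signal to distil from.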
