Partial Network Cloning

03/19/2023
by Jingwen Ye, et al.

In this paper, we study a novel task that enables partial knowledge transfer from pre-trained models, which we term Partial Network Cloning (PNC). Unlike prior methods that update all or at least part of the parameters in the target network throughout the knowledge transfer process, PNC conducts partial parametric "cloning" from a source network and then injects the cloned module into the target, without modifying the target's parameters. Thanks to the transferred module, the target network gains additional functionality, such as inference on new classes; whenever needed, the cloned module can be readily removed from the target, with its original parameters and competence kept intact. Specifically, we introduce an innovative learning scheme that simultaneously identifies the component to be cloned from the source and the position at which to insert it within the target network, so as to ensure optimal performance. Experimental results on several datasets demonstrate that our method yields a significant improvement of 5% over parameter-tuning based methods. Our code is available at https://github.com/JngwenYe/PNCloning.
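To make the clone-and-inject idea concrete, here is a minimal toy sketch (not the authors' actual method or code): two "networks" are represented as lists of NumPy weight matrices, a layer is cloned from the source and inserted at a chosen position in the target, and removing it afterwards leaves the target's parameters bit-for-bit intact. The layer granularity, insertion index, and `forward` function are illustrative assumptions only; in the paper, both the cloned component and the insertion position are learned.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "networks": each is a list of weight matrices applied in sequence.
source = [rng.standard_normal((4, 4)) for _ in range(3)]
target = [rng.standard_normal((4, 4)) for _ in range(3)]
target_snapshot = [w.copy() for w in target]  # for verifying the target is untouched

# "Clone" a module (here: a single layer, chosen by hand for illustration)
# from the source network...
cloned = source[1].copy()

# ...and inject it at a chosen position in the target, without modifying
# any of the target's own parameters.
insert_at = 2
augmented = target[:insert_at] + [cloned] + target[insert_at:]

def forward(net, x):
    # Illustrative forward pass: matmul + tanh per layer.
    for w in net:
        x = np.tanh(x @ w)
    return x

x = rng.standard_normal(4)
y_augmented = forward(augmented, x)

# The cloned module can be readily removed; the restored target matches
# the original exactly.
restored = augmented[:insert_at] + augmented[insert_at + 1:]
assert all(np.array_equal(a, b) for a, b in zip(restored, target_snapshot))
```

The key property this sketch demonstrates is non-destructiveness: because the target's parameters are never written to, removal of the cloned module recovers the original network exactly, unlike fine-tuning approaches.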

