Knowledge Flow: Improve Upon Your Teachers

04/11/2019
by Iou-Jen Liu et al.

A zoo of deep nets is available these days for almost any given task, and it is increasingly unclear which net to start with when addressing a new task, or which net to use as an initialization for fine-tuning a new model. To address this issue, we develop knowledge flow, which moves 'knowledge' from multiple deep nets, referred to as teachers, to a new deep net model, called the student. The structures of the teachers and the student can differ arbitrarily, and they can be trained on entirely different tasks with different output spaces. After training with knowledge flow, the student is independent of the teachers. We demonstrate our approach on a variety of supervised and reinforcement learning tasks, outperforming fine-tuning and other 'knowledge exchange' methods.
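To make the idea concrete, below is a minimal PyTorch sketch of one way a teacher-to-student mixing layer could look. The class and method names (KnowledgeFlowLayer, dependence_penalty), the softmax gating scheme, the layer widths, and the annealed penalty are illustrative assumptions for exposition, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


class KnowledgeFlowLayer(nn.Module):
    """Mixes the student's own hidden features with projected features
    from several frozen teachers via normalized, learnable weights."""

    def __init__(self, student_dim, teacher_dims):
        super().__init__()
        # One linear map per teacher, projecting its features into the
        # student's feature space (teachers may have arbitrary widths).
        self.projections = nn.ModuleList(
            nn.Linear(d, student_dim) for d in teacher_dims
        )
        # One mixing logit per source: index 0 is the student itself,
        # the remaining entries correspond to the teachers.
        self.mix_logits = nn.Parameter(torch.zeros(len(teacher_dims) + 1))

    def forward(self, student_feat, teacher_feats):
        w = torch.softmax(self.mix_logits, dim=0)
        out = w[0] * student_feat
        for i, (proj, feat) in enumerate(zip(self.projections, teacher_feats)):
            out = out + w[i + 1] * proj(feat)
        return out

    def dependence_penalty(self):
        # Measures how much the layer still leans on the teachers.
        # Annealing a multiplier on this term upward during training
        # drives the student's own weight w[0] toward 1.
        return 1.0 - torch.softmax(self.mix_logits, dim=0)[0]


# Toy usage: two teachers with different widths feed one student layer.
layer = KnowledgeFlowLayer(student_dim=128, teacher_dims=[256, 64])
h = layer(torch.randn(8, 128), [torch.randn(8, 256), torch.randn(8, 64)])
penalty = layer.dependence_penalty()  # add lambda_t * penalty to the task loss
```

Because the teachers enter only through the softmax-weighted mixture, pushing the student's own weight to 1 leaves a network that runs without them, consistent with the abstract's claim that the trained student is independent of its teachers.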

