Transferring Knowledge across Learning Processes

by Sebastian Flennerhag et al.

In complex transfer learning scenarios, new tasks might not be tightly linked to previous tasks. Approaches that transfer only the information contained in the final parameters of a source model will therefore struggle. Instead, transfer learning at a higher level of abstraction is needed. We propose Leap, a framework that achieves this by transferring knowledge across learning processes. We associate each task with a manifold on which the training process travels from initialization to final parameters, and construct a meta-learning objective that minimizes the expected length of this path. Our framework leverages only information obtained during training and can be computed on the fly at negligible cost. We demonstrate that our framework outperforms competing methods, both in meta-learning and transfer learning, on a set of computer vision tasks. Finally, we demonstrate that Leap can transfer knowledge across learning processes in demanding reinforcement learning environments (Atari) that involve millions of gradient steps.
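The core idea of minimizing the expected length of the training path can be illustrated with a toy sketch. The code below is a first-order approximation of the pull-forward idea on a family of quadratic tasks, measuring path increments jointly in parameter and loss space; the task family, step sizes, and `leap_increment` helper are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def make_task(rng, dim=5):
    """Toy task: quadratic loss f(x) = 0.5 * ||x - c||^2 with a random optimum c."""
    c = rng.normal(size=dim)
    f = lambda x: 0.5 * np.sum((x - c) ** 2)
    grad = lambda x: x - c
    return f, grad

def inner_train(f, grad, theta0, steps=20, lr=0.1):
    """Run gradient descent from theta0, recording the trajectory (theta_i, f(theta_i))."""
    traj = [(theta0.copy(), f(theta0))]
    theta = theta0.copy()
    for _ in range(steps):
        theta = theta - lr * grad(theta)
        traj.append((theta.copy(), f(theta)))
    return traj

def leap_increment(traj, grad):
    """First-order pull-forward increment: a gradient estimate for the length of
    the training path, measured jointly in parameter and loss space.
    Simplified sketch of the idea, not the paper's exact update rule."""
    d = np.zeros_like(traj[0][0])
    for (th_i, f_i), (th_j, f_j) in zip(traj[:-1], traj[1:]):
        # parameter-space displacement plus the loss-coordinate contribution
        d += (th_i - th_j) + (f_i - f_j) * grad(th_i)
    return d

rng = np.random.default_rng(0)
dim = 5
theta0 = rng.normal(size=dim)
initial_norm = np.linalg.norm(theta0)  # task optima are centered at 0

for meta_step in range(60):
    meta_grad = np.zeros(dim)
    for _ in range(4):  # batch of sampled tasks
        f, g = make_task(rng, dim)
        traj = inner_train(f, g, theta0)
        meta_grad += leap_increment(traj, g)
    theta0 -= 0.02 * meta_grad / 4  # pull the initialization forward
```

Because the task optima are drawn with zero mean, shortening the expected training path pulls the shared initialization toward the center of the task distribution; the meta-gradient itself costs only the bookkeeping of consecutive trajectory points, matching the "on the fly at negligible cost" claim.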




Transfer Learning for Algorithm Recommendation

Meta-Learning is a subarea of Machine Learning that aims to take advanta...

Robustifying Sequential Neural Processes

When tasks change over time, meta-transfer learning seeks to improve the...

Merging Models with Fisher-Weighted Averaging

Transfer learning provides a way of leveraging knowledge from one task w...

A Unified Meta-Learning Framework for Dynamic Transfer Learning

Transfer learning refers to the transfer of knowledge or information fro...

Multilingual Speech Recognition using Knowledge Transfer across Learning Processes

Multilingual end-to-end (E2E) models have shown a great potential in the ...

Kernel Modulation: A Parameter-Efficient Method for Training Convolutional Neural Networks

Deep Neural Networks, particularly Convolutional Neural Networks (ConvNe...

MetaNOR: A Meta-Learnt Nonlocal Operator Regression Approach for Metamaterial Modeling

We propose MetaNOR, a meta-learnt approach for transfer-learning operato...

Code Repositories


Transfer Learning library for Deep Neural Networks.



Original PyTorch implementation of the Leap meta-learner, along with code for running the Omniglot experiment presented in the paper.



Meta-Learning with Warped Gradient Descent



Simple, extensible implementations of some meta-learning algorithms in Jax
