MetaPerturb: Transferable Regularizer for Heterogeneous Tasks and Architectures

06/13/2020
by   Jeongun Ryu, et al.

Regularization and transfer learning are two popular techniques for improving generalization on unseen data, a fundamental problem in machine learning. Regularization techniques are versatile, as they are task- and architecture-agnostic, but they do not exploit the large amounts of data available. Transfer learning methods learn to transfer knowledge from one domain to another, but they may not generalize across tasks and architectures, and they may introduce additional training cost when adapting to the target task. To bridge the gap between the two, we propose a transferable perturbation, MetaPerturb, which is meta-learned to improve generalization performance on unseen data. MetaPerturb is implemented as a set-based lightweight network that is agnostic to the size and order of its input and is shared across layers. We then propose a meta-learning framework that jointly trains the perturbation function over heterogeneous tasks in parallel. Because MetaPerturb is a set function trained over diverse distributions across layers and tasks, it can generalize to heterogeneous tasks and architectures. We validate the efficacy and generality of MetaPerturb, trained on a specific source domain and architecture, by applying it to the training of diverse neural architectures on heterogeneous target datasets, comparing against various regularizers and fine-tuning. The results show that networks trained with MetaPerturb significantly outperform the baselines on most tasks and architectures, with a negligible increase in parameter size and no hyperparameters to tune.
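The key architectural claim in the abstract is that a *set-based* perturbation function is agnostic to the size and order of its input, which is what allows one instance to be shared across layers of different widths and across architectures. The following is a minimal, hypothetical sketch of that idea only: the class name `SetPerturb`, the channel-mean summary statistic, the sigmoid gating, and the toy fixed parameters are all illustrative assumptions, not the paper's actual design (where the parameters would be meta-learned over heterogeneous tasks).

```python
import math


class SetPerturb:
    """Toy sketch of a set-based perturbation function.

    Each channel is summarized by a scalar statistic (here, its mean
    activation). A permutation-invariant pooled context over the set of
    channel statistics is combined with each per-channel statistic through
    a shared affine map, producing one multiplicative gate per channel.
    Because the function only consumes a *set* of statistics, it accepts
    any channel count and any channel ordering, so a single instance can
    be applied at every layer of any architecture.
    """

    def __init__(self, w_self=0.5, w_ctx=0.3, bias=0.1):
        # Toy fixed parameters; in the paper's framework these would be
        # meta-learned jointly over heterogeneous source tasks.
        self.w_self, self.w_ctx, self.bias = w_self, w_ctx, bias

    def __call__(self, channel_means):
        # Permutation-invariant context: mean over the set of channels.
        ctx = sum(channel_means) / len(channel_means)
        # Per-channel gate in (0, 1): sigmoid of a shared affine map of
        # (own statistic, pooled context). Output order follows input
        # order, so the map is permutation-equivariant.
        return [
            1.0 / (1.0 + math.exp(-(self.w_self * m + self.w_ctx * ctx + self.bias)))
            for m in channel_means
        ]


perturb = SetPerturb()
gates_3ch = perturb([0.2, -1.0, 0.5])  # a 3-channel layer
gates_2ch = perturb([0.1, 0.4])        # a 2-channel layer: same function works
```

The point of the sketch is the input signature, not the particular statistic: because the function never depends on channel count or ordering, the same (meta-learned) parameters can perturb feature maps in ResNets, VGG-style networks, or any other architecture without per-target adaptation.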

