Task-similarity Aware Meta-learning through Nonparametric Kernel Regression

06/12/2020
by Arun Venkitaraman et al.

Meta-learning refers to the process of abstracting a learning rule for a class of tasks through a meta-parameter that captures the inductive bias for the class. The meta-parameter is used to achieve fast adaptation to unseen tasks from the class, given a few training samples. While meta-learning implicitly assumes the tasks to be similar, it is generally unclear how this similarity can be quantified. Further, many popular meta-learning approaches do not actively use such task-similarity in solving for the tasks. In this paper, we propose a task-similarity aware nonparametric meta-learning algorithm that explicitly employs the similarity/dissimilarity between tasks using nonparametric kernel regression. Our approach models the task-specific parameters as lying in a reproducing kernel Hilbert space, where the kernel function captures the similarity across tasks. The proposed algorithm iteratively learns a meta-parameter that is used to assign a task-specific descriptor to every task. The task descriptors are then used to quantify the similarity through the kernel function. We show how our approach generalizes the popular meta-learning approaches of model-agnostic meta-learning (MAML) and meta-stochastic gradient descent (Meta-SGD). Numerical experiments with regression tasks show that our algorithm performs well even in the presence of outlier or dissimilar tasks, validating the proposed approach.
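The core idea of estimating parameters for an unseen task as a kernel-weighted combination over seen tasks can be illustrated with plain nonparametric (Nadaraya-Watson) kernel regression. The sketch below is illustrative only, not the paper's algorithm: the Gaussian kernel, the bandwidth `sigma`, and the toy descriptors and parameters are all assumptions made here for the example.

```python
import numpy as np

def gaussian_kernel(d1, d2, sigma=1.0):
    """Similarity between two task descriptors (assumed Gaussian/RBF kernel)."""
    return np.exp(-np.sum((d1 - d2) ** 2) / (2 * sigma ** 2))

def predict_task_params(new_descriptor, task_descriptors, task_params, sigma=1.0):
    """Nadaraya-Watson kernel regression: estimate the parameters of an
    unseen task as a similarity-weighted average of seen-task parameters."""
    weights = np.array([gaussian_kernel(new_descriptor, d, sigma)
                        for d in task_descriptors])
    weights /= weights.sum()          # normalize so the weights sum to 1
    return weights @ np.stack(task_params)

# Toy example: three seen tasks with 1-D descriptors and 2-D parameter vectors.
# The third task is a deliberate "outlier": its descriptor is far away, so
# its (very different) parameters receive negligible weight.
descriptors = [np.array([0.0]), np.array([1.0]), np.array([5.0])]
params = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([10.0, 10.0])]
theta_new = predict_task_params(np.array([0.5]), descriptors, params)
```

Because the new descriptor sits midway between the first two tasks, `theta_new` is close to the average of their parameter vectors, while the outlier task contributes almost nothing; this is the sense in which kernel weighting confers robustness to dissimilar tasks.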


Related research

- Local Nonparametric Meta-Learning (02/09/2020): A central goal of meta-learning is to find a learning rule that enables ...
- Coupling Retrieval and Meta-Learning for Context-Dependent Semantic Parsing (06/17/2019): In this paper, we present an approach to incorporate retrieved datapoint...
- Online gradient-based mixtures for transfer modulation in meta-learning (12/14/2018): Learning-to-learn or meta-learning leverages data-driven inductive bias ...
- Dynamic Kernel Selection for Improved Generalization and Memory Efficiency in Meta-learning (06/03/2022): Gradient based meta-learning methods are prone to overfit on the meta-tr...
- Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings (06/05/2019): Current meta-learning approaches focus on learning functional representa...
- Distribution-Agnostic Model-Agnostic Meta-Learning (02/12/2020): The Model-Agnostic Meta-Learning (MAML) algorithm has been celebr...
- PAC-Bayesian Meta-learning with Implicit Prior (03/05/2020): We introduce a new and rigorously-formulated PAC-Bayes few-shot meta-lea...
