Faster Optimization-Based Meta-Learning Adaptation Phase

06/13/2022
by   Kostiantyn Khabarlak, et al.

Neural networks require large amounts of annotated data to learn. Meta-learning algorithms offer a way to reduce the number of training samples to only a few. One of the most prominent optimization-based meta-learning algorithms is Model-Agnostic Meta-Learning (MAML). However, its key procedure, adaptation to new tasks, is quite slow. In this work we propose an improvement to the MAML meta-learning algorithm. We introduce Lambda patterns, which restrict which weights in the network are updated during the adaptation phase, making it possible to skip certain gradient computations. The fastest pattern is selected subject to an allowed quality-degradation threshold; in certain cases, careful pattern selection can even improve quality. Our experiments show that Lambda adaptation pattern selection significantly improves MAML in two areas: adaptation time is reduced by a factor of 3 with minimal accuracy loss, and accuracy for one-step adaptation is substantially improved.
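The core idea, restricting which layers receive gradient updates during the inner adaptation loop, can be illustrated with a small sketch. This is not the authors' implementation: the two-layer linear model, the `adapt` helper, and the encoding of a pattern as a per-layer binary mask are all assumptions made for illustration. The speed-up in the paper comes from skipping the gradient computation for frozen layers entirely; here both gradients are computed and the mask merely selects which update is applied.

```python
import numpy as np

def loss(params, x, y):
    """Squared loss of a two-layer linear model (illustrative stand-in)."""
    w1, w2 = params
    return float(np.mean((x @ w1 @ w2 - y) ** 2))

def grads(params, x, y):
    """Analytic gradients of the squared loss for the two-layer model."""
    w1, w2 = params
    err = x @ w1 @ w2 - y                    # (n, 1) residuals
    g2 = (x @ w1).T @ err * 2 / len(x)       # gradient w.r.t. w2
    g1 = x.T @ (err @ w2.T) * 2 / len(x)     # gradient w.r.t. w1
    return [g1, g2]

def adapt(params, x, y, pattern, lr=0.05, steps=5):
    """MAML-style inner-loop adaptation with a Lambda-like layer mask.

    pattern[i] == 1 updates layer i; pattern[i] == 0 freezes it. In a real
    implementation the frozen layer's gradient would not be computed at all,
    which is the source of the adaptation-time saving.
    """
    params = [p.copy() for p in params]
    for _ in range(steps):
        g = grads(params, x, y)
        params = [p - lr * gi if keep else p
                  for p, gi, keep in zip(params, g, pattern)]
    return params

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 4))
y = x @ rng.normal(size=(4, 1))
params = [rng.normal(size=(4, 3)) * 0.5, rng.normal(size=(3, 1)) * 0.5]

full = adapt(params, x, y, pattern=[1, 1])       # standard MAML adaptation
last_only = adapt(params, x, y, pattern=[0, 1])  # cheaper: first layer frozen
```

Under this toy setup, both patterns reduce the loss on the adaptation data; the point of the paper's pattern search is to find masks like `[0, 1]` whose loss penalty stays within the allowed threshold while the gradient cost drops.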


Related Research

12/05/2018: The effects of negative adaptation in Model-Agnostic Meta-Learning
The capacity of meta-learning algorithms to quickly adapt to a variety o...

02/07/2021: Meta-Learning with Neural Tangent Kernels
Model Agnostic Meta-Learning (MAML) has emerged as a standard framework ...

04/25/2019: Faster and More Accurate Learning with Meta Trace Adaptation
Learning speed and accuracy are of universal interest for reinforcement ...

06/08/2018: Adversarial Meta-Learning
Meta-learning enables a model to learn from very limited data to underta...

06/22/2020: Siamese Meta-Learning and Algorithm Selection with 'Algorithm-Performance Personas' [Proposal]
Automated per-instance algorithm selection often outperforms single lear...

02/01/2023: Efficient Meta-Learning via Error-based Context Pruning for Implicit Neural Representations
We introduce an efficient optimization-based meta-learning technique for...

02/21/2021: Fast On-Device Adaptation for Spiking Neural Networks via Online-Within-Online Meta-Learning
Spiking Neural Networks (SNNs) have recently gained popularity as machin...
