EMO: Episodic Memory Optimization for Few-Shot Meta-Learning

06/08/2023
by Yingjun Du, et al.

Few-shot meta-learning challenges gradient-descent optimization because each task provides only a handful of training samples. To address this issue, we propose an episodic memory optimization for meta-learning, which we call EMO, inspired by the human ability to recall past learning experiences from memory. EMO retains the gradient history of previously experienced tasks in an external memory, enabling few-shot learning in a memory-augmented way. By learning to retain and recall the learning process of past training tasks, EMO nudges parameter updates in the right direction even when the gradients provided by a limited number of examples are uninformative. We prove theoretically that our algorithm converges for smooth, strongly convex objectives. EMO is generic, flexible, and model-agnostic: a simple plug-and-play optimizer that can be seamlessly embedded into existing optimization-based few-shot meta-learning approaches. Empirical results show that EMO scales well across most few-shot classification benchmarks, improves the performance of optimization-based meta-learning methods, and accelerates convergence.
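
The abstract describes EMO as a plug-and-play optimizer that stores gradients from past tasks in an external memory and recalls them to steady the noisy gradients of few-shot updates. As a rough, self-contained sketch of that idea only (not the authors' implementation): the class name, cosine-similarity retrieval, the convex mixing weight, and FIFO eviction below are all illustrative assumptions.

```python
import numpy as np

class EpisodicMemoryOptimizer:
    """Sketch of an EMO-style optimizer: plain SGD augmented with an
    external memory of gradients from past tasks. Memory size,
    similarity-based retrieval, and the mixing rule are assumptions
    for illustration, not the paper's exact algorithm."""

    def __init__(self, lr=0.01, memory_size=100, mix=0.5):
        self.lr = lr
        self.memory = []              # gradient history of past tasks
        self.memory_size = memory_size
        self.mix = mix                # weight on the recalled gradient

    def _recall(self, grad):
        # Retrieve the stored gradient most similar to the current one
        # (cosine similarity is an assumed retrieval rule).
        sims = [g @ grad / (np.linalg.norm(g) * np.linalg.norm(grad) + 1e-12)
                for g in self.memory]
        return self.memory[int(np.argmax(sims))]

    def _store(self, grad):
        self.memory.append(grad.copy())
        if len(self.memory) > self.memory_size:
            self.memory.pop(0)        # FIFO eviction (assumed policy)

    def step(self, params, grad):
        if self.memory:
            recalled = self._recall(grad)
            # Nudge the possibly uninformative few-shot gradient
            # toward the direction recalled from past tasks.
            grad = (1 - self.mix) * grad + self.mix * recalled
        self._store(grad)
        return params - self.lr * grad

# Hypothetical usage on a toy quadratic objective ||params||^2
opt = EpisodicMemoryOptimizer(lr=0.1)
params = np.ones(5)
for _ in range(20):
    grad = 2 * params                 # gradient of the toy objective
    params = opt.step(params, grad)
```

The key design point the abstract emphasizes is model-agnosticism: the memory wraps around an existing gradient update rather than replacing it, which is why it can be dropped into existing optimization-based meta-learners.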
