Hierarchical Variational Memory for Few-shot Learning Across Domains

12/15/2021
by Yingjun Du, et al.

Neural memory enables fast adaptation to new tasks with just a few training samples. Existing memory models store features only from the single last layer, which does not generalize well in the presence of a domain shift between training and test distributions. Rather than relying on a flat memory, we propose a hierarchical alternative that stores features at different semantic levels. We introduce a hierarchical prototype model, where each level of the prototype fetches corresponding information from the hierarchical memory. The model is endowed with the ability to flexibly rely on features at different semantic levels when the domain shift so demands. We meta-learn the model by a newly derived hierarchical variational inference framework, where hierarchical memory and prototypes are jointly optimized. To explore and exploit the importance of different semantic levels, we further propose to learn the weights associated with the prototype at each level in a data-driven way, which enables the model to adaptively choose the most generalizable features. We conduct thorough ablation studies to demonstrate the effectiveness of each component in our model. New state-of-the-art performance on cross-domain few-shot classification and competitive performance on traditional few-shot classification further substantiate the benefit of hierarchical variational memory.
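As a minimal sketch of the weighted multi-level prototype idea the abstract describes: each class gets a prototype at every semantic level (the mean of its support features at that level), and a query is scored by a weighted sum of per-level distances, with the level weights learned in a data-driven way. The variational memory recall and meta-learning machinery are omitted here; all function names and the fixed logits standing in for learned weights are hypothetical, not from the paper.

```python
import numpy as np

def level_weights(level_logits):
    # Softmax over per-level logits: stands in for the learned, data-driven
    # importance of each semantic level described in the abstract.
    e = np.exp(level_logits - np.max(level_logits))
    return e / e.sum()

def hierarchical_prototypes(support):
    # support: (num_classes, num_shots, num_levels, dim) per-level features
    # of the labeled support samples. The prototype of a class at each level
    # is the mean of its support features at that level.
    return support.mean(axis=1)  # (num_classes, num_levels, dim)

def classify(query, prototypes, level_logits):
    # query: (num_levels, dim). Score each class by a weighted sum of
    # negative squared distances to its prototype at every level; the
    # weights come from a softmax over level_logits.
    w = level_weights(level_logits)               # (num_levels,)
    d = ((prototypes - query) ** 2).sum(axis=-1)  # (num_classes, num_levels)
    scores = -(d * w).sum(axis=-1)                # (num_classes,)
    return int(np.argmax(scores))
```

Under a domain shift, levels whose features still transfer would receive larger weights, shifting the classification toward the most generalizable representation.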

Related research

- 10/20/2020 · Learning to Learn Variational Semantic Memory — In this paper, we introduce variational semantic memory into meta-learni...
- 06/26/2023 · ProtoDiff: Learning to Learn Prototypical Networks by Task-Guided Diffusion — Prototype-based meta-learning has emerged as a powerful technique for ad...
- 05/08/2021 · MetaKernel: Learning Variational Random Features with Limited Labels — Few-shot learning deals with the fundamental and challenging problem of ...
- 05/17/2023 · MetaModulation: Learning Variational Feature Hierarchies for Few-Shot Learning with Fewer Tasks — Meta-learning algorithms are able to learn a new task using previously l...
- 06/05/2021 · Meta-Learning with Variational Semantic Memory for Word Sense Disambiguation — A critical challenge faced by supervised word sense disambiguation (WSD)...
- 02/20/2021 · Kanerva++: Extending the Kanerva Machine with Differentiable, Locally Block Allocated Latent Memory — Episodic and semantic memory are critical components of the human memory...
- 05/06/2018 · Context Spaces as the Cornerstone of a Near-Transparent & Self-Reorganizing Semantic Desktop — Existing Semantic Desktops are still reproached for being too complicate...
