Metalearning with Hebbian Fast Weights

07/12/2018
by Tsendsuren Munkhdalai et al.

We unify recent neural approaches to one-shot learning with older ideas of associative memory in a model for metalearning. Our model learns jointly to represent data and to bind class labels to representations in a single shot. It builds representations via slow weights, learned across tasks through SGD, while fast weights constructed by a Hebbian learning rule implement one-shot binding for each new task. On the Omniglot, Mini-ImageNet, and Penn Treebank one-shot learning benchmarks, our model achieves state-of-the-art results.
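The slow/fast-weight split lends itself to a short illustration. Below is a minimal NumPy sketch of the general idea, not the paper's architecture: a fixed random encoder stands in for slow weights that would in practice be trained across tasks with SGD, while a Hebbian outer-product rule builds fast weights that bind each class label to its representation from a single support example. All names, dimensions, and data here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Slow weights: in the paper these are learned across tasks with SGD;
# here a random matrix is a placeholder for that trained encoder.
D_in, D_rep, n_classes = 64, 32, 5
W_slow = rng.normal(0, 0.1, size=(D_rep, D_in))

def represent(x):
    """Map a raw input to a representation using the slow weights."""
    return np.tanh(W_slow @ x)

# --- One-shot binding phase (one labeled example per class) ---
# Hebbian rule: fast weights accumulate the outer product of the
# one-hot label and the example's representation.
W_fast = np.zeros((n_classes, D_rep))
support_x = rng.normal(size=(n_classes, D_in))   # hypothetical support set
for c in range(n_classes):
    h = represent(support_x[c])
    y = np.eye(n_classes)[c]                     # one-hot label
    W_fast += np.outer(y, h)                     # Hebbian update

# --- Query phase: classify using only the fast weights ---
query = support_x[2] + 0.05 * rng.normal(size=D_in)  # noisy copy of class 2
scores = W_fast @ represent(query)
print("predicted class:", int(np.argmax(scores)))    # expected: 2
```

Because the fast weights are a sum of outer products, querying them reduces to comparing the query's representation against each stored class representation by inner product, which is what makes the binding work after a single example.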


Related research

Discriminative k-shot learning using probabilistic models (06/01/2017)
This paper introduces a probabilistic framework for k-shot image classif...

Infinite Mixture Prototypes for Few-Shot Learning (02/12/2019)
We propose infinite mixture prototypes to adaptively represent both simp...

Uncertainty in Multitask Transfer Learning (06/20/2018)
Using variational Bayes neural networks, we develop an algorithm capable...

Learning to learn by Self-Critique (05/24/2019)
In few-shot learning, a machine learning system learns from a small set ...

One-Shot Learning in Discriminative Neural Networks (07/18/2017)
We consider the task of one-shot learning of visual categories. In this ...

Fuzzy Simplicial Networks: A Topology-Inspired Model to Improve Task Generalization in Few-shot Learning (09/23/2020)
Deep learning has shown great success in settings with massive amounts o...

One-shot learning of paired associations by a reservoir computing model with Hebbian plasticity (06/07/2021)
One-shot learning can be achieved by algorithms and animals, but how the...
