Attentive Recurrent Comparators

03/02/2017
by Pranav Shyam, et al.

Rapid learning requires flexible representations that quickly adapt to new evidence. We develop a novel class of models called Attentive Recurrent Comparators (ARCs) that form representations of objects by cycling through them and making observations. Using the representations extracted by ARCs, we develop a way of approximating a dynamic representation space and use it for one-shot learning. On the task of one-shot classification on the Omniglot dataset, we achieve state-of-the-art performance with an error rate of 1.5%. This is the first super-human result achieved on this task by a generic model that uses only pixel information.
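The core idea, comparing two objects by alternating attentive glimpses into a shared recurrent state, can be sketched in a few lines. The sketch below is illustrative only: the dimensions, the hard-crop attention, and the plain tanh RNN are simplifying assumptions (the paper uses a smooth attention kernel and an LSTM controller), and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): 8x8 images, 4x4 glimpses,
# a 32-unit recurrent state, and 8 alternating glimpses (4 per image).
IMG, GLIMPSE, HIDDEN, STEPS = 8, 4, 32, 8

# Randomly initialised parameters; a real ARC would learn these end to end.
W_h = rng.normal(0, 0.1, (HIDDEN, HIDDEN))            # recurrent weights
W_x = rng.normal(0, 0.1, (HIDDEN, GLIMPSE * GLIMPSE)) # glimpse-input weights
W_attn = rng.normal(0, 0.1, (2, HIDDEN))              # state -> glimpse centre

def glimpse(image, h):
    """Crop a GLIMPSE x GLIMPSE window whose centre is predicted from h.
    A hard crop stands in for the paper's differentiable attention."""
    cx, cy = (np.tanh(W_attn @ h) + 1) / 2 * (IMG - GLIMPSE)  # in [0, IMG-GLIMPSE]
    x, y = int(round(cx)), int(round(cy))
    return image[y:y + GLIMPSE, x:x + GLIMPSE].ravel()

def arc_embedding(img_a, img_b):
    """Cycle attention between the two images, one glimpse per step,
    accumulating observations in the recurrent state."""
    h = np.zeros(HIDDEN)
    for t in range(STEPS):
        image = img_a if t % 2 == 0 else img_b  # alternate over the pair
        h = np.tanh(W_h @ h + W_x @ glimpse(image, h))
    return h  # final state encodes the comparison of the pair

a, b = rng.random((IMG, IMG)), rng.random((IMG, IMG))
emb = arc_embedding(a, b)
print(emb.shape)  # (32,)
```

In a trained model this final state would be mapped to a similarity score, and those pairwise similarities drive the dynamic representation space used for one-shot classification.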

Related research

- Integrative Few-Shot Learning for Classification and Segmentation (03/29/2022): We introduce the integrative task of few-shot classification and segment...
- Fast and Generalized Adaptation for Few-Shot Learning (11/25/2019): The ability of fast generalizing to novel tasks from a few examples is c...
- Learning Instance and Task-Aware Dynamic Kernels for Few Shot Learning (12/07/2021): Learning and generalizing to novel concepts with few samples (Few-Shot L...
- TADAM: Task dependent adaptive metric for improved few-shot learning (05/23/2018): Few-shot learning has become essential for producing models that general...
- Improving Few-shot Learning with Weakly-supervised Object Localization (05/25/2021): Few-shot learning often involves metric learning-based classifiers, whic...
- Motion-Attentive Transition for Zero-Shot Video Object Segmentation (03/09/2020): In this paper, we present a novel Motion-Attentive Transition Network (M...
