Independent Prototype Propagation for Zero-Shot Compositionality

06/01/2021
by Frank Ruis, et al.

Humans are good at compositional zero-shot reasoning; someone who has never seen a zebra before could nevertheless recognize one when told it looks like a horse with black and white stripes. Machine learning systems, on the other hand, usually leverage spurious correlations in the training data, and while such correlations can help recognize objects in context, they hurt generalization. To deal with underspecified datasets while still leveraging contextual clues during classification, we propose ProtoProp, a novel prototype propagation graph method. First we learn prototypical representations of objects (e.g., zebra) that are conditionally independent w.r.t. their attribute labels (e.g., stripes), and vice versa. Next we propagate the independent prototypes through a compositional graph to learn compositional prototypes of novel attribute-object combinations that reflect the dependencies of the target distribution. The method does not rely on any external data, such as class hierarchy graphs or pretrained word embeddings. We evaluate our approach on AO-CLEVr, a synthetic and strongly visual dataset with clean labels, and UT-Zappos, a noisy real-world dataset of fine-grained shoe types. We show that in the generalized compositional zero-shot setting we outperform the state of the art, and through ablations we show the importance of each part of the method and its contribution to the final results.
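The core idea (independent attribute and object prototypes composed into prototypes for unseen pairs, then nearest-prototype classification) can be illustrated with a minimal sketch. Everything here is assumed for illustration: the prototypes are random vectors standing in for learned representations, and a fixed averaging matrix stands in for the learned graph propagation network described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # prototype dimensionality (assumed for this toy example)

# Independently learned prototypes for attributes and objects (random stand-ins)
attr_protos = {"striped": rng.normal(size=D), "plain": rng.normal(size=D)}
obj_protos = {"horse": rng.normal(size=D), "zebra": rng.normal(size=D)}

def compose(attr, obj):
    """Compose a prototype for an (attribute, object) pair.

    A simple fixed averaging matrix replaces the learned propagation
    step over the compositional graph used in the actual method.
    """
    W = np.hstack([np.eye(D), np.eye(D)]) / 2.0  # placeholder weights
    x = np.concatenate([attr_protos[attr], obj_protos[obj]])
    return W @ x

# Build compositional prototypes for every pair, seen or unseen
pairs = [(a, o) for a in attr_protos for o in obj_protos]
protos = {p: compose(*p) for p in pairs}

def classify(embedding):
    """Assign an embedding to the nearest compositional prototype (cosine)."""
    def cos(p):
        v = protos[p]
        return embedding @ v / (np.linalg.norm(embedding) * np.linalg.norm(v) + 1e-9)
    return max(protos, key=cos)

# A query close to the composed "striped zebra" prototype is recovered,
# even though that pair could be unseen during training
query = protos[("striped", "zebra")] + 0.01 * rng.normal(size=D)
print(classify(query))
```

In the paper itself the composition is learned via propagation over a compositional graph rather than fixed averaging, but the classification-by-nearest-composed-prototype structure is the same.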


