How to represent part-whole hierarchies in a neural network

02/25/2021
by Geoffrey Hinton, et al.

This paper does not describe a working system. Instead, it presents a single idea about representation which allows advances made by several different groups to be combined into an imaginary system called GLOM. The advances include transformers, neural fields, contrastive representation learning, distillation and capsules. GLOM answers the question: How can a neural network with a fixed architecture parse an image into a part-whole hierarchy which has a different structure for each image? The idea is simply to use islands of identical vectors to represent the nodes in the parse tree. If GLOM can be made to work, it should significantly improve the interpretability of the representations produced by transformer-like systems when applied to vision or language.
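The islands-of-identical-vectors idea can be illustrated with a small sketch. Assume each image location holds an embedding vector at some level of the hierarchy; contiguous locations whose vectors (nearly) agree form an "island", and each island stands for one node of the parse tree. The `find_islands` function and the 0.95 threshold below are hypothetical choices for illustration, not part of the paper:

```python
import numpy as np

def find_islands(vectors, threshold=0.95):
    """Greedily group adjacent locations whose embeddings agree.

    vectors: (num_locations, dim) array of embeddings at one level,
    one per image location (assumed to be in 1-D scan order here).
    Returns a list of islands, each a list of location indices.
    """
    # Normalize so cosine similarity is a plain dot product.
    unit = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    islands = [[0]]
    for i in range(1, len(unit)):
        # A location whose vector agrees with its neighbour's joins the
        # same island; otherwise a new parse-tree node begins.
        if unit[i] @ unit[i - 1] >= threshold:
            islands[-1].append(i)
        else:
            islands.append([i])
    return islands

# Toy example: two distinct part vectors yield two islands (two nodes).
part_a = np.array([1.0, 0.0])
part_b = np.array([0.0, 1.0])
vectors = np.stack([part_a, part_a, part_a, part_b, part_b])
print(find_islands(vectors))  # [[0, 1, 2], [3, 4]]
```

Because the islands differ from image to image while the network's architecture stays fixed, the same mechanism can represent a different parse tree for every input.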


Related research

- MaskCLIP: Masked Self-Distillation Advances Contrastive Language-Image Pretraining (08/25/2022)
  This paper presents a simple yet effective framework MaskCLIP, which inc...

- The Next 700 Program Transformers (08/25/2021)
  In this paper, we describe a hierarchy of program transformers in which ...

- An Introduction to Transformers (04/20/2023)
  The transformer is a neural network component that can be used to learn ...

- Transformers Can Be Expressed In First-Order Logic with Majority (10/06/2022)
  Characterizing the implicit structure of the computation within neural n...

- What to Hide from Your Students: Attention-Guided Masked Image Modeling (03/23/2022)
  Transformers and masked language modeling are quickly being adopted and ...

- Hyperbolic Image-Text Representations (04/18/2023)
  Visual and linguistic concepts naturally organize themselves in a hierar...

- Equivariant Transformer Networks (01/25/2019)
  How can prior knowledge on the transformation invariances of a domain be...
