Object-Centric Image Generation with Factored Depths, Locations, and Appearances

by Titas Anciukevicius et al.

We present a generative model of images that explicitly reasons over the set of objects they show. Our model learns a structured latent representation that separates objects from each other and from the background; unlike prior works, it explicitly represents the 2D position and depth of each object, as well as an embedding of its segmentation mask and appearance. The model can be trained from images alone in a purely unsupervised fashion without the need for object masks or depth information. Moreover, it always generates complete objects, even though a significant fraction of training images contain occlusions. Finally, we show that our model can infer decompositions of novel images into their constituent objects, including accurate prediction of depth ordering and segmentation of occluded parts.
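To make the factored representation concrete, here is a minimal sketch of how a per-object latent with 2D position, depth, a mask embedding, and an appearance embedding could be composed into an image. All names (`ObjectLatent`, `compose_scene`, the toy decoders) are illustrative assumptions, not the paper's actual architecture; in the real model the mask and appearance would be decoded by learned networks.

```python
import numpy as np

class ObjectLatent:
    """Hypothetical factored latent for one object: 2D position, a scalar
    depth, and embeddings for its segmentation mask and appearance."""
    def __init__(self, position, depth, mask_embedding, appearance_embedding):
        self.position = np.asarray(position, dtype=float)  # (x, y) in [0, 1]
        self.depth = float(depth)                          # smaller = closer
        self.mask_embedding = np.asarray(mask_embedding)
        self.appearance_embedding = np.asarray(appearance_embedding)

def compose_scene(background, objects, decode_mask, decode_appearance):
    """Render objects back-to-front so nearer objects occlude farther ones.

    `decode_mask` and `decode_appearance` stand in for learned decoders that
    map an object's latent to a full-image alpha mask (H, W) and an RGB
    layer (H, W, 3). Because each object is decoded in full before
    compositing, occluded parts still exist in the representation.
    """
    canvas = background.copy()
    # Sort far-to-near: the object with the largest depth is painted first,
    # so later (nearer) objects overwrite it wherever their masks overlap.
    for obj in sorted(objects, key=lambda o: -o.depth):
        alpha = decode_mask(obj)       # (H, W), values in [0, 1]
        rgb = decode_appearance(obj)   # (H, W, 3)
        canvas = (1 - alpha[..., None]) * canvas + alpha[..., None] * rgb
    return canvas
```

Depth-ordered alpha compositing is what lets a model like this explain occlusion: two complete objects plus a depth ordering reproduce a partially hidden object in the image, without the latent ever representing a truncated shape.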



